Helium + MEMS Oscillators = iOS device failure?

A system administrator noticed that every iOS device in the medical facility where he worked stopped functioning after a recent installation of an MRI machine. It turns out this was not related to any electromagnetic interference, but rather to helium of all things. Below is one of the most interesting things I have ever read about electronics:

Original Post: https://www.reddit.com/r/sysadmin/comments/9mk2o7/mri_disabled_every_ios_device_in_facility/

Follow-up with some interesting comments from other people who have had strange bugs that were also hard to track down: https://www.reddit.com/r/sysadmin/comments/9si6r9/postmortem_mri_disables_every_ios_device_in/

The tl;dr is that there was a helium leak during the setup of the MRI. MRIs use very large, powerful magnets, and liquid helium is used to cool them. The coolant was leaking and spread into the facility’s HVAC system. Many iOS devices use MEMS oscillators to generate a clock signal rather than quartz. This is for packaging and cost-saving reasons, since they are smaller (thinner) and cheaper than a standard quartz oscillator. Quartz oscillators are hermetically sealed in metal to prevent interference, but MEMS oscillators are usually sealed in plastic. Helium atoms are small enough to permeate the plastic and interfere with the clock signal, and since the clock signal is vital to numerous parts of an IC, the iOS devices stopped working. I do recommend reading the entire post though; it’s super interesting.

 


This week in tech: Bloomberg’s hit piece on Supermicro, iPhone Xr reviews surface, and new Macs coming next week

First off, here is the Bloomberg article that claimed Supermicro’s hardware had been compromised during manufacturing in China:

https://www.bloomberg.com/news/articles/2018-10-09/new-evidence-of-hacked-supermicro-hardware-found-in-u-s-telecom

Companies that use servers with motherboards manufactured by California-based Supermicro have all come out stating that the Bloomberg article was hogwash and that there is no evidence of any compromised hardware or software on the supposedly affected servers. This includes tech giants such as Amazon and Apple, who would definitely have a lot to lose if they were trying to cover this up. It’s clear that Bloomberg is full of it and should retract the story, since the evidence is clearly not there. It should be interesting to see how this plays out legally, since the story sent Supermicro’s shares plummeting. Looks like I won’t be reading anything from Bloomberg anymore.

 

Secondly, the iPhone Xr has arrived in the hands of tech reviewers and the first reviews are surfacing. Here are some substantial ones by Nilay Patel at The Verge and John Gruber:

https://daringfireball.net/2018/10/the_iphone_xr

https://www.theverge.com/2018/10/23/18011306/apple-iphone-xr-review-camera-screen-battery-price

I have some strong opinions about the 326 ppi LCD screen on a $750 phone, but then again I haven’t seen it in person, so I can’t judge just yet.

 

Finally, we have some new Macs coming! I am very excited to see a new Mac mini as well as a nice refresh for the rest of the lineup.

https://appleinsider.com/articles/18/10/24/four-new-macs-spotted-in-eurasian-regulatory-filings

Xamarin Tips and Tricks – Re-generating the Mono GAC after updating .NET SDK

In my current work environment we use Babel for .NET to obfuscate our Xamarin binaries. The nice thing about Babel is that it’s the only Xamarin obfuscator that runs directly on macOS, eliminating the need for a Windows machine just for release builds. Things were working fine until I installed the latest Visual Studio for Mac update along with the latest versions of Xcode, the Xamarin SDK, and the .NET SDK. It turns out that Babel had been removed from Mono’s Global Assembly Cache (GAC), which is basically where shared DLLs (like Babel’s build DLL) get registered so you don’t need an absolute path to reference them. Adding the DLL back via gacutil solved it (if you have Babel, the exact gacutil command syntax is included with the zip they send you). I am guessing you’ll need to do this every time there is a major update to the .NET SDK or the Mono framework. Thanks to Alberto from Babel for the solution.
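For reference, the generic Mono gacutil syntax looks like this. The DLL path below is just a placeholder; substitute the actual location of the Babel build DLL from your installation:

```shell
# Register an assembly in Mono's Global Assembly Cache (-i = install).
# The path below is hypothetical; use the path from Babel's own instructions.
gacutil -i /path/to/babel.build.dll

# List the GAC contents to confirm the assembly was registered.
gacutil -l | grep -i babel
```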

Fixing things broken by macOS Mojave

I haven’t had enough time to fully play around with Mojave so I can’t write a full review yet. Things have already broken (as expected) and I will be keeping this list updated as I find and fix them:

Subpixel antialiasing workaround (have yet to try this out myself): https://www.reddit.com/r/apple/comments/9inu3e/if_the_font_rendering_on_mojave_looks_odd_to_you/

Missing Safari Extensions (such as uBlock Origin): Upon starting Safari after updating to Mojave, you will notice that your extensions are gone. You will also notice that trying to install extensions from Preferences > Extensions redirects you to the Mac App Store, since Safari extensions are now hosted there. You can still install extensions from the Safari Extensions Gallery website here: https://safari-extensions.apple.com/?category=mostpopular

Recently opened apps added to Dock: You will notice that apps that you haven’t pinned to the Dock are still there even after you close them. This is because the option “Show recent applications in Dock” is now enabled by default. You can turn it off under System Preferences > Dock > Show recent applications in Dock.
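If you prefer the command line, the same Dock setting can be toggled with defaults. This assumes the stock Dock; the change takes effect once the Dock restarts:

```shell
# Disable "Show recent applications in Dock" (Mojave and later).
defaults write com.apple.dock show-recents -bool false

# Restart the Dock so the change takes effect.
killall Dock
```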

MITM Attacks in Mobile Apps

A man-in-the-middle (MITM) attack is a common technique used to steal data in transit between a mobile app and a web service. An easy way to see what sort of data your app is leaking is to use a MITM proxy server such as the Python-based mitmproxy. Today I’m going to show a quick demo using a small app that sends and receives JSON payloads, with mitmproxy capturing all of that data. If you want to try this out yourself, you can use your own app or feel free to use mine:

Requirements:

-A mobile app to test with (the one used in the video is available here: https://github.com/ShravanJ/MITMAppDemo/)

-mitmproxy

-A web service endpoint to deliver a JSON payload (I used json-server for testing)

First you will need to set up mitmproxy to work with your phone. Once you have that set up, you can test how your app handles web service requests, whether it be JSON, SOAP, or loading images and videos. This should give you some insight into how easy it is to see what your app is doing when communicating with a web service. One way to help prevent this is to implement Certificate Pinning. I have provided some implementation guides below:
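The basic proxy setup looks something like the following. The port numbers and the db.json filename are just assumptions for the demo, so adjust them to your environment:

```shell
# 1. Start mitmproxy on the machine acting as the proxy (default port 8080).
mitmproxy --listen-port 8080

# 2. On the phone, set the Wi-Fi proxy to this machine's IP address, port 8080,
#    then visit http://mitm.it in the phone's browser and install the mitmproxy
#    CA certificate. On iOS, also enable full trust for it under
#    Settings > General > About > Certificate Trust Settings.

# 3. Serve a test JSON payload; json-server exposes db.json as a REST API.
json-server --watch db.json --port 3000
```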

For Xamarin apps:

https://github.com/chrisriesgo/xamarin-cert-pinning

https://thomasbandt.com/certificate-and-public-key-pinning-with-xamarin

For native iOS apps:

https://github.com/datatheorem/TrustKit

https://www.bugsee.com/blog/ssl-certificate-pinning-on-ios-using-trustkit/

I will probably be doing another video with a web service running over HTTPS and show how Certificate Pinning can help stop MITM attacks. And as usual, thanks for reading.

The (super short) iMac Pro Review

I recently upgraded my work Mac mini (Late 2014) to the base iMac Pro. After spending about a week working with it I am pleased to say that it has surpassed all expectations I had for it.


DESIGN

The iMac Pro is the first iMac to be finished in the modern Apple “Space Grey”. This, along with the 27″ 5K panel, makes for a striking combination. The included space grey accessories and black Lightning cable are a nice touch. Overall the iMac Pro has a great desk presence for any setting. This is the first iMac I have ever used, and the compactness of having the screen and processing hardware all in one is great for saving desk space and for general aesthetics.

PERFORMANCE

Here is why I really went for the iMac Pro: the 8-core Xeon W processor matched with 32 GB of ECC DDR4 RAM and 1 TB of NVMe-based storage (which is actually dual 512 GB drives running in RAID 0). The performance is simply staggering, with apps opening basically instantly. The biggest difference I saw was in Xamarin.Forms build times, which are cut in half in most of my workloads. Compiling, uploading, and debugging are noticeably faster and smoother compared to the maxed-out Mac mini I was using previously. I haven’t tested native Xcode project build times just yet, but I am guessing they will be halved as well.

The only issue I have had so far (and this isn’t really related to CPU performance) is that viewing the 5K display over VNC through a VPN was terrible with the built-in VNC server. I tried RealVNC, which I use on the other development Macs, but its latest licensing model has switched to a yearly subscription, so I had to look elsewhere. I ended up settling on OSXvnc (https://github.com/stweil/OSXvnc), which performs somewhere in between the built-in VNC server and RealVNC. Setup was pretty easy; the only thing I had to do manually was set it to launch at login via System Preferences.

SUMMARY

Overall the iMac Pro, even in its base config, is a fantastic development machine. It is a massive step up from the late 2014 Mac mini and most consumer-grade Macs in general. If you can justify the $5,000 price tag, it is totally worth it.

 

How to use a piezoelectric buzzer with ARM based Arduino compatibles

I recently had to integrate a basic passive piezoelectric buzzer into a project using the Adafruit Bluefruit Feather nRF52, an Arduino IDE-compatible development board based on the Nordic Semiconductor nRF52832 SoC, which contains an ARM Cortex-M4F processor. Googling how to use a piezo buzzer with Arduino, all guides pointed towards the built-in tone() library, which should do the trick. But there is one problem: tone() currently has no native support for ARM-based controllers, due to timing changes that would need to be made relative to the AVR-compatible version. The solution is simple: just use basic PWM to make the buzzer buzz. Here is a wiring diagram to get it working:

[Image: buzzer wiring diagram]

Just hook up the positive side of the buzzer to any PWM-capable header and the negative side to ground. In this case I have it connected to A4, which translates to digital output 28 according to this pinout:

[Image: nRF52 pinout diagram]

Now that we have the wiring done, we need to write a program to drive the buzzer. This requires the Arduino IDE, of course, along with the correct BSP installed (check Adafruit’s website for the BSP install info for this particular board). Now for the program itself.
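The original embedded sketch did not survive here, so below is a reconstruction based on the description that follows; it assumes the buzzer is on digital pin 28 (A4) as wired above, and a 115200 baud serial connection:

```cpp
// Reconstructed sketch: buzz a passive piezo on digital pin 28 for a duration
// (in milliseconds) entered via the Serial Monitor.
const int BUZZER_PIN = 28;       // A4 on the Bluefruit Feather nRF52
const int HALF_PERIOD_US = 1000; // 1000 us half-period ~= 500 Hz tone

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
  Serial.begin(115200);
  Serial.println("Enter buzz duration in milliseconds:");
}

void loop() {
  if (Serial.available() > 0) {
    long durationMs = Serial.parseInt();
    if (durationMs > 0) {
      unsigned long start = millis();
      // Bit-bang a square wave: alternate HIGH/LOW with a fixed delay.
      while (millis() - start < (unsigned long)durationMs) {
        digitalWrite(BUZZER_PIN, HIGH);
        delayMicroseconds(HALF_PERIOD_US);
        digitalWrite(BUZZER_PIN, LOW);
        delayMicroseconds(HALF_PERIOD_US);
      }
    }
  }
}
```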

This solution lets you enter the duration of the buzz into the Serial Monitor, and the piezo buzzer will buzz for that allotment of time. It uses digitalWrite() to send an alternating HIGH and LOW signal to the buzzer with a 1000 microsecond delay between transitions. Changing the delay alters the pitch of the buzz: shorter delays produce a higher-pitched sound and longer delays a lower pitch. Feel free to change the delays to match your desired pitch. This quick and simple solution will work with pretty much all Arduino compatibles that support digitalWrite().

Intel RealSense D435: Intel’s answer to Kinect?

Today I placed a pre-order for the Intel RealSense D435, a stereoscopic depth-sensing camera that is the new flagship device in the Intel RealSense family. You may already be using a RealSense product in certain Ultrabooks, since RealSense modules are used for Windows Hello. The latest RealSense D400-class cameras feature all-new image and depth processors as well as stereo depth cameras. This is what really sets it apart from the Kinect, which just uses a single depth sensor paired with an active IR projector to improve depth data. With two depth cameras, you can get a wider FOV and still maintain acceptable resolution. The specs on paper are really quite impressive, mainly regarding the resolution and FPS of the depth sensor. The D435 can gather depth data at a resolution of 1280 x 720 @ 90 FPS, which makes the Kinect v2’s depth data capture of 512 x 424 @ 30 FPS look pretty basic. Then again, the Kinect v2 was launched in 2013, so I expect Intel’s latest hardware to be better.

Hardware aside, the D435 looks to be a worthy successor to the Kinect, but for my use case I care more about the software. The project I worked on last summer relied solely on the Kinect’s native skeletal tracking functionality in the Kinect for Windows SDK. Without it, our time to market would have been much longer, since we would have had to take a more object-tracking-based approach to our application. We have continued to rely on the body tracking for other projects as well, so body tracking in our next camera is also a must. The Intel RealSense 2016 SDK did contain preview components for body tracking, but that is limited to older RealSense cameras. Sadly, the RealSense SDK 2.0, which the D435 requires, does not include any body tracking functionality. A company by the name of 3DIVI claims to have the solution with their NuiTrack SDK, which offers Kinect-like body tracking functionality with competing depth-sensing cameras such as the Orbbec Astra.
The website claims that Intel RealSense support is coming soon. Apparently Microsoft is referring Kinect customers to Intel RealSense for body tracking, and my best bet is that Intel will have some sort of deal to work with NuiTrack. I have no idea if there is going to be any special licensing for RealSense customers or if we will have to pay the same licensing fee as someone using, say, the Orbbec Astra. We will just have to wait and see. According to my confirmation email, the D435 should ship within 6 weeks; I’m hoping it comes much sooner.

So far my experience with the Orbbec Astra, a camera we evaluated as a Kinect replacement even before Microsoft announced the discontinuation, has not been great. The hardware doesn’t seem too bad, but the software is really what killed it for me. The current body tracking SDK, while in beta, is nowhere near that of the Kinect or even NuiTrack. The example program would often mistake my Herman Miller Aeron chair for a person and offered very poor tracking in poses and positions that are relevant to our application. Their development pace has been picking up but is still pretty slow. I am not very likely to continue pursuing the Orbbec route and instead plan on sticking with RealSense along with NuiTrack; the combo offers better hardware and software than Orbbec and its home-grown SDK solution. Still, this will basically be a complete platform overhaul, mainly because we had a lot of .NET conveniences when developing with Kinect, and NuiTrack is written in C++. I am still learning C++, so jumping straight into a project involving sophisticated depth-sensing equipment and interfacing with other peripherals will be quite a challenge. Then again, I do like these sorts of challenges. I’ll post more about the D435 when I receive it, as well as a deeper dive into the NuiTrack SDK once we buy a license. Stay tuned.