I have started a new blog dedicated to cars and the auto industry: https://thetorqueconverter.wordpress.com
A man-in-the-middle (MITM) attack is a common technique used to steal data in transit between a mobile app and a web service. An easy way to see what sort of data your app is leaking is to use a MITM proxy server such as the Python-based mitmproxy. Today I’m going to show a quick demo using a small app that sends and receives JSON payloads, with mitmproxy capturing all of that data. If you want to try this out yourself you can use your own app, or feel free to use mine:
- A mobile app to test with (the one used in the video is available here: https://github.com/ShravanJ/MITMAppDemo/)
- A web service endpoint to deliver a JSON payload (I used json-server for testing)
You will need to set up mitmproxy to work with your phone using the following steps. Once you have that set up, you can test how your app handles web service requests, whether that's JSON, SOAP, or loading images and videos. This should give you some insight into how easy it is to see what your app is doing when communicating with a web service. One way to help prevent this is to implement certificate pinning. I have provided some implementation guides below:
For Xamarin apps:
For native iOS apps:
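For reference, the desktop side of a typical mitmproxy setup is only a couple of commands. This is just a sketch assuming mitmproxy is installed via pip and your phone is on the same Wi-Fi network as your computer; the port and menu paths are the standard defaults:

```shell
# Install mitmproxy (requires Python 3)
pip install mitmproxy

# Start the interactive proxy; it listens on port 8080 by default
mitmproxy

# On the phone:
#   1. Set the Wi-Fi proxy to this machine's IP address, port 8080
#   2. Browse to http://mitm.it and install the mitmproxy CA certificate
#      (on iOS, also enable full trust under Settings > General > About >
#       Certificate Trust Settings so HTTPS traffic can be decrypted)
```

Once the CA certificate is trusted on the device, every request your app makes will show up in the mitmproxy console, headers and bodies included.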
I recently upgraded my work Mac mini (Late 2014) to the base iMac Pro. After spending about a week working with it I am pleased to say that it has surpassed all expectations I had for it.
The iMac Pro is the first iMac to be finished in the modern Apple “Space Grey”. This, along with the 27″ 5K panel, makes for a striking combination. The included space grey accessories and a black Lightning cable are a nice touch. Overall the iMac Pro has a great desk presence for any setting. This is the first iMac I have ever used, and the compactness of having the screen and processing hardware all in one is great for saving desk space and for general aesthetics.
Here is why I really went for the iMac Pro: the 8-core Xeon W processor matched with 32 gigabytes of ECC DDR4 RAM and 1 TB of NVMe-based storage (which is actually dual 512 GB drives running in RAID 0). The performance is simply staggering, with apps opening basically instantly. The biggest difference I saw was in Xamarin.Forms app build times, which are cut in half in most of my workloads. Compiling, uploading, and debugging are noticeably faster and smoother compared to the maxed-out Mac mini I was using previously. I haven’t tested native Xcode project build times just yet, but I am guessing they will be halved as well. The only issue I have had so far (and this isn’t really related to CPU performance) is that viewing the 5K display over VNC through a VPN was terrible with the built-in VNC server. I tried RealVNC, which I use on the other development Macs, but its latest licensing model has switched to a yearly subscription, so I had to look elsewhere. I ended up settling on OSXvnc (https://github.com/stweil/OSXvnc), which performs somewhere in between the built-in VNC server and RealVNC. Setup was pretty easy; the only thing I had to do manually was set it to launch at login via System Preferences.
Overall the iMac Pro, even in its base config, is a fantastic development machine. It is a massive step up from the late 2014 Mac mini and most consumer-grade Macs in general. If you can justify the $5,000 price tag, it is totally worth it.
Looks like I’ll be attending for the first time! See y’all in San Jose!
I recently had to integrate a basic passive piezoelectric buzzer into a project utilizing the Adafruit Bluefruit Feather nRF52, an Arduino IDE-compatible development board based on the Nordic Semiconductor nRF52832 SoC, which contains an ARM Cortex-M4F processor. Upon googling how to use a piezo buzzer with Arduino, all the guides pointed towards the built-in tone() library, which should do the trick. But there is one problem: there is currently no native support for this ARM-based controller, due to timing changes that would need to be made from the AVR-compatible version. The solution is simple: just use basic PWM to make the buzzer buzz. Here is a wiring diagram to get it working:
Just hook up the positive side of the buzzer to any PWM-capable header and the negative side to ground. In this case I have it connected to A4, which translates to digital output 28 according to this pinout:
Now that we have the wiring done, we need to write a program to drive the buzzer. This requires the Arduino IDE, of course, along with the correct BSP installed (check Adafruit’s website for the BSP install info for this particular board). Now for the program itself.
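Here is a minimal sketch of what that program can look like. The pin number matches the wiring above; the baud rate and the 1000-microsecond half-period are my choices, so adjust them to taste:

```cpp
// Reads a buzz duration (in milliseconds) from the Serial Monitor and
// bit-bangs a square wave on the buzzer pin with digitalWrite().

const int BUZZER_PIN = 28;        // A4 on the Bluefruit Feather nRF52
const int HALF_PERIOD_US = 1000;  // 1000 us half-period ~= 500 Hz tone

void setup() {
  pinMode(BUZZER_PIN, OUTPUT);
  Serial.begin(115200);
  Serial.println("Enter buzz duration in milliseconds:");
}

void loop() {
  if (Serial.available() > 0) {
    long durationMs = Serial.parseInt();  // read the requested duration
    if (durationMs > 0) {
      unsigned long start = millis();
      // Toggle the pin until the requested duration has elapsed
      while (millis() - start < (unsigned long)durationMs) {
        digitalWrite(BUZZER_PIN, HIGH);
        delayMicroseconds(HALF_PERIOD_US);
        digitalWrite(BUZZER_PIN, LOW);
        delayMicroseconds(HALF_PERIOD_US);
      }
    }
  }
}
```

With a 1000 microsecond delay on each half of the cycle, the full period is 2000 microseconds, which works out to a tone of roughly 500 Hz.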
This solution lets you enter the duration of the buzz into the Serial Monitor, and the piezo buzzer will buzz for that amount of time. It uses digitalWrite() to send an alternating HIGH and LOW signal to the buzzer with a 1000 microsecond delay between each transition. Changing the delay will alter the pitch of the buzz, with shorter delays producing a higher-pitched sound and longer delays producing a lower pitch. Feel free to change the delays to match your desired pitch. This quick and simple solution will work with pretty much all Arduino compatibles that support digitalWrite().
Today I placed a pre-order for the Intel RealSense D435, a stereoscopic depth-sensing camera that is the new flagship device in the Intel RealSense family. You may already be using a RealSense product in certain Ultrabooks, since RealSense modules are used for Windows Hello. The latest RealSense D400-class cameras feature an all-new image and depth processor as well as stereo depth cameras. This is what really sets it apart from the Kinect, which just uses a single depth sensor paired with an active IR projector to improve depth data. With two depth cameras, you can get a wider FOV and still maintain acceptable resolution. The specs on paper are really quite impressive, mainly the resolution and FPS of the depth sensor. The D435 can gather depth data at a resolution of 1280 x 720 @ 90 FPS, which makes the Kinect v2’s depth data capture of 512 x 424 @ 30 FPS look pretty basic. Then again, the Kinect v2 was launched in 2013, so I expect Intel’s latest hardware to be better. Hardware aside, the D435 looks to be a worthy successor to the Kinect, but for my use case I care more about the software. The project I worked on last summer relied solely on the Kinect’s native skeletal tracking functionality in the Kinect for Windows SDK. Without that, our time to market would have been much longer, since we would have had to take a more object-tracking-based approach to our application. We have continued to rely on body tracking for other projects as well, so body tracking in our next camera is a must. The Intel RealSense 2016 SDK did contain preview components for body tracking, but those are limited to older RealSense cameras. Sadly, the RealSense SDK 2.0, which the D435 requires, does not include any body tracking functionality. A company by the name of 3DIVI claims to have the solution with their NuiTrack SDK, which offers Kinect-like body tracking functionality with competing depth-sensing cameras such as the Orbbec Astra.
The website claims that Intel RealSense support is coming soon. Apparently Microsoft is referring Kinect customers to Intel RealSense for body tracking, and my best bet is that Intel will have some sort of deal to work with NuiTrack. I have no idea if there is going to be any special licensing for RealSense customers or if we will have to pay the same licensing fee as someone who is using, say, the Orbbec Astra. We will just have to wait and see. According to my confirmation email, the D435 should ship within 6 weeks; I’m hoping it comes much sooner than that. So far my experience with the Orbbec Astra, a camera that we evaluated as a replacement for the Kinect even before Microsoft announced the discontinuation, has not been so great. The hardware doesn’t seem too bad, but the software is really what killed it for me. The current body tracking SDK, while in beta, is nowhere near that of the Kinect or even NuiTrack. The example program would often mistake my Herman Miller Aeron chair for a person and offered very poor tracking in poses and positions that are relevant to our application. Their development pace has been picking up but is still pretty slow. I am not very likely to continue pursuing the Orbbec route and instead plan on sticking with RealSense along with NuiTrack. The combo offers better hardware and software than Orbbec and their home-grown SDK solution. Still, this will basically be a complete platform overhaul, mainly because we had a lot of .NET conveniences when developing with Kinect, and NuiTrack is written in C++. I am still learning C++, so jumping straight into a project involving sophisticated depth-sensing equipment and interfacing with other peripherals will be quite a challenge. Then again, I do like these sorts of challenges. I’ll post more about the D435 when I receive it, as well as a deeper dive into the NuiTrack SDK once we buy a license. Stay tuned.
The late 2014 Mac mini, unlike all of the Mac minis before it, features soldered-on RAM as well as a very-difficult-to-access hard drive that is not intended to be user-replaceable. This, along with the lack of a quad-core i7 option, has led people who wanted Mac minis to go for a used 2012 model. Since I purchased my Mac mini for work, I settled on a late 2014 because I wanted the warranty, the newer processor, and longer macOS support. I went with the 2.6GHz Core i5 with 16GB of RAM and the molasses-slow 1TB 5400 RPM hard drive. Having been spoiled by the performance of the PCIe SSD in my late 2013 Retina MacBook Pro, I didn’t realize how slow macOS is on spinning media. It is *really* slow, to the point where my MacBook was the faster development machine. The whole point of buying the Mac mini was so that I didn’t have to dock my MacBook Pro into my monitors and peripherals every time I needed to get work done. So after months of putting up with molasses-slow disk reads and writes, I decided to look into an SSD upgrade. After watching a few YouTube videos on replacing the internal HD, I realized that not only is it excessively difficult for a hard drive replacement, but there are many things I could end up breaking along the way. I need my mini for work, and I was not keen on opening up a new $800 computer. So I looked at external drives and saw that USB was an option, but with some drawbacks. USB 3.0 has a max throughput of 5.0Gbps while SATA III is 6.0Gbps, so I wouldn’t be getting the full bandwidth. UASP-compatible SATA-to-USB adapters promise almost full SATA-like performance, since UASP tunnels the SCSI protocol over USB. This is also supposed to enable TRIM support, but from what I read, macOS does not allow TRIM over USB. So USB 3.0 was out. What other high-speed connection does the Mac mini have? Thunderbolt, of course!
With a 20Gbps link speed, Thunderbolt 2 is still a very fast standard and provides more than enough headroom for SATA. I finally came across the AKiTiO Thunder SATA Go, an external Thunderbolt dock that connects SATA to eSATA to Thunderbolt, negotiating a full 6Gbps link. Since this is basically a direct SATA uplink, TRIM is natively supported on SSDs. Sweet. Pair it with a Samsung 1TB 850 EVO and you have an absolutely killer SSD upgrade for your Mac mini without even opening it up. This convenience does come at a cost, however, as the Thunder SATA Go is $95, a price you would not have to pay if you just upgraded the disk internally. I think it’s worth it, though, since I was able to get up and running in about 2 hours: I cloned my hard drive to the SSD using SuperDuper!, set it as my startup disk, and then erased the spinning hard drive. I now have 1TB of super-fast solid state storage and 1TB of bulk spinning storage, which is more than I will ever need for a development machine, but boy was it worth it. Boot time, app start-up time, and overall system responsiveness have improved tenfold. It feels like a different computer now; finally a true replacement for my MacBook Pro.
Highly recommended security update that everyone running High Sierra needs to install. Patches a bug that allows the creation and authentication of a root user account without a password. If you have automatic updating turned on for security updates, you should have it automatically download and install. Otherwise check the App Store > Updates tab for the security update.
This is yet another blunder by Apple’s macOS engineering team. The software QA is reaching a new low, and it’s really disappointing. So far it’s not enough to make me switch back to using Windows full time, but if this continues I am definitely going to consider it.
The iPhone X is possibly the most anticipated iPhone since the original iPhone. It represents the most drastic change in the 10-year evolution of the smartphone that took over the world, and it is helping propel Apple toward becoming a 1 trillion dollar company. I had been closely following rumors of this phone since my iPhone 6 started showing its age last year. Once I knew about the edge-to-edge display and facial recognition capabilities, I knew I had to jump on the hype train and buy it come release day. And here we are, 24 hours after launch, and I am still damn impressed with the phone.
Apple simply knocked it out of the park as always. I thought my iPhone 4 and 6 were well built, but the X is on another level. The finish and attention to detail are impeccable. The glass back and stainless steel band in “Space Grey” look fantastic. It feels heavy and very high quality, but is still relatively comfortable to hold. The way the screen curves into the band and the rest of the body is just perfect. I really can’t say enough about the way the phone looks and feels; you need to see it for yourself.
The OLED screen on the iPhone X is something really special. It is arguably the best OLED screen you can find on a smartphone right now. According to Apple, although the display is manufactured by Samsung, it was custom designed for the X. It is PenTile, supports HDR10 and Dolby Vision, runs at 60Hz while sampling touch at 120Hz, and goes edge to edge (except for the notch). I can safely say this is the best screen ever put on an iPhone and the best screen I have ever seen on a mobile device. Colors are crisp and the blacks are very deep, with just the right amount of contrast without making it look like an oversaturated Galaxy S8 or Note. It gets very bright when you need to use it outside and dims to about the same level as previous iPhone LCDs when you need to use it in the dark. The only thing I’d be worried about is burn-in over time, which is common with all OLED displays. Apple has said they have used hardware and software to mitigate this, but we won’t know for a while. For now, though, it really is a great display.
Face ID and the TrueDepth camera system
This is probably my favorite part of the iPhone. Since working with the Kinect over the summer I have been interested in depth-sensing cameras, and getting to see one in an iPhone is very exciting. Using technology pioneered at PrimeSense and refined over time at Apple, the TrueDepth camera system is an engineering marvel. What used to require a device as large as the Kinect now occupies the small notch at the top of the smartphone. The main purpose of this setup is Face ID, which in my testing has been working very well. I have tested it in darkness, in daylight, and with sunglasses, all of which work well. It does struggle at certain angles and works best in darkness, as some lighting conditions do not play well with it. I also found that it did not work when I had my glasses off, maybe because I trained it while wearing them. It is not perfect, but I would say it is still faster than the Touch ID sensor in my iPhone 6. Apps that already use Touch ID will work with Face ID, which is a plus. I did notice that apps that have not been updated to prompt for Face ID displayed a message that the app was designed for Touch ID and not Face ID, along with the normal prompt asking whether or not you want to let the app use Face ID. Along with Face ID, the TrueDepth camera is also used for Animoji, a feature that I honestly am not that interested in. I tried it; it seems cool, but that’s all. If you want to learn more about it, read The Verge’s review, in which Nilay Patel claims it is the best selling point of the phone.
The A11 Bionic processor
I wasn’t all that amused during the keynote when the processor powering the iPhone was dubbed the A11 Bionic. What a silly name; I mean, A10 Fusion sounded cool, but Bionic just sounded silly to me. Anyway, the processor packs a serious punch, with synthetic benchmarks such as Geekbench showing Apple’s silicon engineering prowess destroying competing devices like the Galaxy S8 and benching close to MacBook Pros. In day-to-day use it is snappy, pretty power efficient based on my usage so far, and a huge upgrade over the A8 found in the iPhone 6. The teardown by iFixit reveals the logic board on which the A11 sits, and oh boy, it is really something to look at. A true silicon masterpiece that makes you step back and realize how far the iPhone has come. A 70% decrease in footprint over the iPhone 7/8 is extremely impressive. From an engineering standpoint it represents a pinnacle of hardware design and packaging, pairing creative thinking with the latest fabrication techniques. But then again, this is Apple, so it is expected.
When Apple announced the iPhone X, they billed it as the future of the smartphone. That is a bold claim even coming from Apple, but in a way, I think they might be right. Just looking at the density of the logic board and the TrueDepth camera, Apple is moving hardware in a new direction at a new pace. Although their innovation in the Mac space has slowed considerably, as has overall software quality, their renewed focus on iPhone hardware is refreshing after three years of the same iPhone 6 design. The original iPhone got a lot of things right, and many of those things are still present in the X. The interface and design may have changed, but the fundamental usability is still there. Here’s to another 10 years of iPhone. Thanks for reading.
One of the biggest headlines in tech today was that Microsoft is killing off the Xbox Kinect sensor (article here). This is quite a blow to hackers and enthusiasts who have been using the Kinect for motion capture, 3D scanning, depth mapping, and general computer vision applications. Introduced for the Xbox 360 in 2010 after being teased under the codename “Project Natal”, the Kinect launched with much fanfare, only to never get any popular games to play with it. Hailed as a useful accessory with the v2 release for the Xbox One, the second-generation Kinect was more powerful, accurate, and capable, since you could use voice commands to navigate your Xbox One. But yet again, even with this promising and advanced piece of technology, game developers never really got on board, and once again there were no show-stopping titles available, which led to its inevitable death. On the technical side, however, the Kinect will continue to live on. PrimeSense, the Israeli manufacturer of the sensor and circuitry used in the original Kinect for Xbox 360, was purchased by Apple in 2013. Their technology can also be found in the ASUS Xtion, which is basically a rebranded PrimeSense Carmine camera. They were arguably one of the most influential companies in the development of consumer 3D depth-sensing technology, contributing to projects such as OpenNI as well as to the sensor technology in general. After the Apple acquisition there were no more PrimeSense cameras being made, and, coming back to what I said earlier about the Kinect technology living on today, the same structured-light sensing technology is now being used in the iPhone X for Face ID. The research that led to a video game accessory that never took off is now behind arguably the biggest feature of a device so hyped up that it is poised for one of the largest preorders of a consumer electronics device ever. It’s really astonishing once you think about it.
But it doesn’t stop there, since Microsoft is continuing to push the edge on vision technologies, not with an Xbox accessory, but with an HMD for mixed reality. I’m talking about the HoloLens, which, while still in development and purchasable only as a development kit, is the advancement and technological successor to the Kinect. It uses sensor technology that was pioneered by the first two Kinects and continues to build on it while taking a new approach to interaction. I am fairly certain the engineers who worked on the Kinect are now all on the HoloLens team (at least I know this guy is), so I think it’s safe to say the Kinect is dead. As you can see in this HN post, a lot of people are saddened, as am I. I worked with the Kinect all summer for my current employer. We are now looking at alternatives going forward, mainly considering the Orbbec Astra, Occipital’s Structure Sensor, and the Stereolabs ZED. As of now, none of these seem to have a mature and extensive SDK like that of the Kinect, nor do they offer fully integrated and functional skeletal tracking, which is our main focus. Orbbec does have a beta for this, but their slow development and release pace is concerning. We’ll see.