You may remember that I installed macOS Sierra on my 2009 HP Pavilion dv6-2000t as referenced by this blog post: https://thestackunderflowblog.wordpress.com/2017/07/07/installing-macos-sierra-on-a-2009-hp-pavilion-laptop/. Since then I had to reinstall Windows 10 for use as a Plex server for a while. A few months later I was able to grab an old SFF Dell desktop with a 3rd gen i5 to take over Plex duties, sending my HP into retirement yet again. At that point I figured I should just recycle the computer since I don't really have much use for it. That changed earlier this week, when I decided to dig it up and give it one more shot at life, with one more shot at a working macOS install.
My Sierra install was pretty smooth except for one jarring issue: lack of proper CPU management. I don't think I was getting the full performance out of the first-gen Core i7-720QM inside macOS, and I never could figure out why. Actually, I am still not sure I'm getting the full performance in Mojave, but so far it seems faster than my Sierra install, so I'll mark that as a win in my book. Now you might ask: why not Catalina instead of Mojave? Simply put, I just couldn't get the Catalina installer to boot, so maybe my hardware is just too old. I actually like Mojave better anyway, since it has fewer of the annoyances that were introduced in Catalina. Overall the install process was pretty similar, with the one headache this time being graphics. With my Sierra install I just used NvidiaInjector in Clover to inject the native macOS NVIDIA drivers, since I have a GeForce 2xx graphics card. I thought this *should* have worked in Mojave despite the end of NVIDIA driver support in High Sierra, but there is one catch: only the Web Drivers stopped working after High Sierra; the built-in native drivers for GeForce 2xx-6xx cards should still work. Despite this, I was still struggling to get it working, only to realize that the microarchitecture of the 200 series doesn't support Metal, which is now the default graphics layer used by the macOS window server. Yet somehow, through some wonky patched drivers, I have some graphics acceleration working in Mojave with my GeForce GT 230M, at least enough to run the laptop display at its native resolution.
Honestly, I am shocked this system works at all. It is using incredibly outdated hardware, and yet here I am, typing this blog post up on an 11-year-old computer running Apple's second most recent Mac operating system. Here is my baseline benchmark for my primary workload: cross-compiling apps with Xamarin in Visual Studio for Mac. The test consists of a freshly created blank Xamarin Forms app targeting Android API v28 and iOS SDK 11.1. Here are the results:
Blank Xamarin Forms app compile times:
HP Pavilion dv6-2000 (i7-720QM, 4 GB RAM, macOS 10.14.6): almost 3 minutes
2018 15″ MacBook Pro (i7-8750H, 16 GB RAM, macOS 10.15.3): just under 18 seconds
The world's worst Hackintosh takes almost 3 minutes to compile a blank Xamarin app, compared to just under 18 seconds on a 2018 15″ MacBook Pro with a Core i7. Of course, the MacBook Pro has more cores at higher clock speeds, but there are clearly microarchitectural improvements at play as well. You would expect such a divide, considering there is a 9-year age gap between these two machines.
For the past 6 years I have been hosting shravanj.com on a Raspberry Pi through a residential internet line. I first started out with a first-generation Model B, then upgraded to a 3rd-gen Model B. There was a massive speed upgrade between the two, but this was later offset when I switched from a symmetrical fiber line with 75 Mbps up/down to a copper cable line with 400 Mbps down and 20 Mbps up. The limited upload, tied to the limited resources of the Pi, became a performance bottleneck for my site over time. Over the past few days I had been looking around at moving my site to an off-premises host when I came across DigitalOcean's incredibly well priced $5/month droplet, which looked perfect for my needs. Today I migrated my entire site in about 45 minutes with just a few steps:
Create a new droplet with a Linux VM
Copy web data over SFTP
Copy and apply Apache vhost configs
Configure iptables for firewall and setup fail2ban
Point DNS to the droplet’s public IP
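Roughly, those steps look like this on the command line. This is a sketch, not a transcript: the hostnames and paths are placeholders, and it assumes an Ubuntu droplet with Apache installed, run as root.

```shell
# 1. Copy web data from the old server over SFTP (scp rides the same SSH transport)
scp -r pi@old-server:/var/www/shravanj.com /var/www/

# 2. Copy and enable the Apache vhost config
scp pi@old-server:/etc/apache2/sites-available/shravanj.com.conf /etc/apache2/sites-available/
a2ensite shravanj.com.conf && systemctl reload apache2

# 3. Basic iptables firewall: allow loopback, established traffic, and SSH/HTTP/HTTPS; drop the rest
iptables -A INPUT -i lo -j ACCEPT
iptables -A INPUT -m conntrack --ctstate ESTABLISHED,RELATED -j ACCEPT
iptables -A INPUT -p tcp -m multiport --dports 22,80,443 -j ACCEPT
iptables -P INPUT DROP

# 4. Install fail2ban, which ships with a sensible default SSH jail
apt install -y fail2ban && systemctl enable --now fail2ban
```

The DNS step happens at your registrar or DNS provider: point the A record at the droplet's public IP and wait for the TTL to expire.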
I am shocked at how easy it is to migrate static websites over to DigitalOcean, and the performance improvement is staggering. Prior to the migration, my homepage would take around 5 to 10 seconds to load; I am now seeing page loads in under a second. This is incredible for just $5 per month! Overall I am very satisfied with the platform and value, and will likely be using DigitalOcean for the foreseeable future.
CORS, or Cross-Origin Resource Sharing, is a web technology that allows cross-origin (read: coming from a different domain) API calls and resources to be shared. Typically, browsers protect against cross-origin calls, but with CORS enabled, your browser will allow these requests when certain header values are returned by the server. This is especially useful when making AJAX calls to another domain's API. When you are consuming someone else's API, this technology has likely already been enabled on their end, but what if you are writing your own Web API? Fortunately, development frameworks such as .NET Core have CORS support built in via middleware. Microsoft provides some solid documentation on it here: https://docs.microsoft.com/en-us/aspnet/core/security/cors?view=aspnetcore-2.2; however, this only works up to .NET Core 2.2. If you are working with .NET Core 3.0 like I am, you will notice that the listed instructions for enabling CORS do not work. This actually has to do with the order of the setup calls being made, and which methods they are made in, as shown in this issue on GitHub: https://github.com/aspnet/AspNetCore/issues/16672
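Concretely, CORS works by the server attaching response headers that tell the browser which cross-origin requests to permit. A simplified example (the origin here is hypothetical):

```
Access-Control-Allow-Origin: https://example.com
Access-Control-Allow-Methods: GET, POST, PUT, DELETE
Access-Control-Allow-Headers: Content-Type, Authorization
```

If the browser doesn't see its origin reflected in these headers (or a wildcard), it blocks the script from reading the response.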
The correct way to enable CORS in .NET Core 3.0 is as follows. Note that this configuration does the following:
Allows any origin (making the API fully accessible to any website or program that wants to call it)
Allows any HTTP method, whether GET, POST, PUT, OPTIONS, DELETE, etc.
Allows any header, such as Content-Type or Authorization
You can limit these by explicitly listing which methods, headers, or origins you want to allow; take a look at the Microsoft documentation for the syntax (it will still work in Core 3.0 as long as you follow the correct order below).
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddCors(); // register CORS after AddControllers (and before AddMvc, if you use it)
}

public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
{
    app.UseRouting();
    // This call MUST be made between UseRouting and UseAuthorization.
    // For the API to be completely accessible to any public consumer, allow requests from any origin;
    // you can add restrictions for allowed methods and headers, but in this case we want to allow them all.
    app.UseCors(options => options.SetIsOriginAllowed(x => true).AllowAnyMethod().AllowAnyHeader());
    app.UseAuthorization();
    app.UseEndpoints(endpoints => endpoints.MapControllers());
}
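If you do want to lock things down, a named policy might look like the sketch below. The policy name and the origin URL are placeholders I've made up; the builder methods themselves are standard ASP.NET Core CORS APIs.

```csharp
// Sketch of a restricted, named CORS policy ("MyPolicy" and the origin are placeholder values)
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers();
    services.AddCors(options =>
        options.AddPolicy("MyPolicy", builder =>
            builder.WithOrigins("https://example.com")      // only this site may call the API
                   .WithMethods("GET", "POST")              // only these HTTP methods
                   .WithHeaders("Content-Type", "Authorization"))); // only these request headers
}

// Then reference the policy by name, still between UseRouting and UseAuthorization:
// app.UseCors("MyPolicy");
```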
While this method seems convenient, it involves some additional setup. Since VS for Mac is a third-party application that needs to access information tied to your Apple ID, you'll need to generate an app-specific password for authentication. This sounds easy, but note that it also requires 2FA to be enabled on that account. I am a strong proponent of 2FA, but this may not work out so easily for everyone, such as when your development device uses a different Apple ID than the one associated with your Apple Developer account. Fortunately, there is a workaround using Xcode's app distribution feature. If you are logged into your development Apple ID in Xcode (Xcode > Preferences > Accounts), you can actually upload directly through the Organizer (Window > Organizer). As long as you have already generated the archive for publishing in VS for Mac, it will show up in the Xcode Organizer, and you can sign and distribute the app via the usual channels (App Store, Ad Hoc, Enterprise).
While I was trying to fix some issues with push notifications on a backend system, I noticed that the device token being generated by my iPhone X running iOS 13.1 looked quite strange, as it was returning something that looked like this:
Note that this solution doesn’t take into account the formatting for iOS versions below 13, which is why I have written a code snippet that does:
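The original snippet isn't reproduced here, but a minimal sketch of the iOS-13-safe approach is to build the hex string from the raw token bytes rather than parsing NSData's description (which is what changed format in iOS 13). The class and method names below are my own; in Xamarin.iOS you would feed it the bytes from the NSData passed to RegisteredForRemoteNotifications (e.g. via ToArray()).

```csharp
using System.Text;

public static class DeviceTokenExtensions
{
    // Convert raw APNs device token bytes to the lowercase hex string a
    // push backend expects, independent of how NSData stringifies itself.
    public static string ToHexString(byte[] deviceToken)
    {
        var sb = new StringBuilder(deviceToken.Length * 2);
        foreach (byte b in deviceToken)
            sb.Append(b.ToString("x2"));
        return sb.ToString();
    }
}
```

Because it never touches the description string, the same code works on iOS 12 and below as well.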
There's actually another issue causing problems with iOS 13, which has to do with the new apns-push-type header. I will go over that in another write-up, but for now I recommend reading the article from OneSignal, as it goes over this well. As always, thanks for reading.
Although I wasn’t able to make it to WWDC this year, here is a quick roundup of some of my favorite things announced:
iOS 13 and iPadOS
iOS 13 sees some evolutionary improvements and new features, big ones including Dark Mode and Sign In with Apple. iPadOS brings a revamped OS designed for the larger displays offered across the iPad lineup. I honestly wasn't expecting a new OS for the iPad, but damn, it looks fantastic. Features like a full file manager and downloads in Safari bring the iPad one step closer to replacing laptops for many users.
In addition to the crossover of unified iOS apps into macOS and the ability to ship iPad apps on macOS, one killer feature that really wowed me was Voice Control. The demo video is simply mind-blowing and really shows how far Apple has taken user accessibility. The feature goes a step further and works across both macOS and iOS. Take a look at this demo, just wow: https://www.youtube.com/watch?v=aqoXFCCTfm4
We go back to the cheese grater era with the latest evolution of the Mac Pro. Up to 28 cores, 1.5TB of RAM, 4TB of blazing fast flash storage, and a 1400W power supply. I need one of these, along with the $6,000 6K Pro Display XDR and its $999 stand. Yeah, it's pricey, but oh man, it has been such a long time since Apple released such a great-looking piece of pro hardware.