LG UltraWide 29WK50S Review: The Dual Monitor Replacement

Ultrawide computer monitors are nothing new; they have been around since 2014 or so but have only risen in popularity more recently. These monitors offer a 21:9 aspect ratio, making them wider but not taller than typical 16:9 displays. The extra horizontal screen real estate makes them better suited for workloads that require many windows placed side by side.

Prior to purchasing the 29WK50S I was using 23″ and 21″ LG monitors, both 1920×1080 @ 60 Hz, which were adequate for me. As I’ve continued spending more time on my computer while working from home, I started looking at upgrading the displays along with the mount they were on, since it was not height adjustable and I couldn’t get the displays to a comfortable eye level. I knew that ultrawide monitors existed but always assumed they were quite expensive, until now. The 29WK50S offers a 2560×1080 IPS panel at up to 75 Hz with FreeSync and 99% sRGB colorspace coverage for just under $200.

After spending a week with the 29WK50S I can confidently say this is a fantastic bang-for-the-buck ultrawide that can easily replace dual 23″ screens. Brightness and color reproduction are superb, the bezels are pretty thin, and it has two HDMI inputs for easy dual-computer connectivity. It’s pretty great for programming since I can snap VS to one half of the screen and keep a browser with Stack Overflow or documentation open in the other. If I need to read or edit a really long line of code, I can snap the editor to take up the entire horizontal space of the display and view the line without really scrolling.

The only drawback I’ve noticed so far is watching videos from sources like YouTube, since most are recorded in a 16:9 aspect ratio. This means black bars on the sides of the video, though thanks to the even LED backlighting you don’t notice them too much. For games that do support 21:9, such as GTA V and Call of Duty: Modern Warfare, the experience is very immersive. Although the display supports a 75 Hz refresh rate with FreeSync, I wasn’t really able to utilize it due to FPS constraints from the GTX 1050 Ti in my system. Overall, the 29WK50S is a great display and a great entry point to 21:9 displays.

The App Store vs Developers

If you know me, you probably know that I am a pretty big fan of most Apple products. I have personally owned 3 Macs and 3 iPhones over the course of 7 years, and I have a pretty big dependence on the iOS ecosystem since I do quite a bit of mobile development work professionally. Without iOS and its development platform, I wouldn’t have been able to kick-start my career like I did when I developed my first commercially released iOS app 4 years ago. With all that said, I think Apple has developed quite a track record of screwing over developers, the people who make their platforms so powerful.

I’m writing this in response to the recent battle between Apple and Basecamp, the group behind the Hey email app. While I don’t use Hey, I completely understand the outrage over Apple’s move to force Hey to offer their subscription as an In-App Purchase, while letting other apps like Netflix and Amazon Prime Video get a free pass. At the core of it, Apple is taking a stance against smaller developers by forcing them to hand over a 30% cut of IAP sales. This is wrong. I would understand such a policy if it were enforced for every app that offers a subscription, but singling out developers is completely unfair. Having dealt with the App Store team firsthand, I can say they mostly have good intentions about protecting the privacy of users and keeping the store well regulated, but when financial motives are involved it can quickly get ugly. Apple has developed a tendency to nickel-and-dime customers, and this trend is now making its way to their development platforms, which is very disappointing.

At this point I don’t think it’s about the 30% commission; it’s Apple’s unfair treatment of Basecamp while letting the bigger players get a free pass. The fact that Apple doubled down on their demands and threatened to remove Hey from the App Store if Basecamp doesn’t add their subscription as an In-App Purchase just makes this even worse. Apple is looking more like Oracle with these types of tactics, where mafia mentality meets technology and platform management. The growing outrage from the developer community toward Apple is totally justified. Even though I have never launched a paid app or an IAP, I can sympathize with developers who have. From delivering lackluster releases of Xcode in recent history to ripping off smaller development teams, Apple is headed in a pretty dark direction from a developer’s standpoint. While we can take a stance against Apple, their market share makes it hard to actually move away from their development ecosystem. All I can ask now is that Apple reverse their decision on Hey, and on any other apps that have been subjected to this sort of extortion, and allow developers to continue to offer the choice of purchasing subscriptions outside of the App Store. If Apple continues with this sort of behavior, who knows what they will do next to screw over the people who make their products worth using.

Sennheiser HD58X Jubilee Review

While Sennheiser might not be a household name, they are well renowned within the enthusiast and professional space for making high performance, high quality audio equipment. Their most popular products include over-ear headphones and studio microphones. While they also sell some more consumer focused gear, their best products are targeted at the “audiophile” crowd and consist of the headphones in the HD series. The archetypal HD 580 solidified Sennheiser’s reputation for delivering (mostly) neutral yet highly enjoyable sound from headphones with relatively large dynamic drivers. This series evolved into the HD 600, 650, and 660S. These headphones have generally been priced in the $399+ range, putting them out of reach of most of the consumer market. That was until the group buy website Massdrop.com (now Drop.com) partnered with Sennheiser to create the HD 58X Jubilee, a 150 ohm open back headphone with a sound signature very close to the 600 series thanks to a very similar driver construction.

I’ve spent about a week with the HD 58X and they are by far the best set of headphones I have ever listened to. I also use a pair of Bose QC25s, which are great for ANC but not the best for detail retrieval, something the HD 58X excels at. They deliver tight, controlled bass, a smooth midrange, and clear treble. This combination allows vocals to really shine through while still retaining plenty of detail in instrumentals. Now, I am by no means an audiophile, but I can definitely hear more resolution in music, from both high bitrate FLAC and 320 kbps Ogg Vorbis streamed from Spotify on the Very High preset. Although you could drive the HD 58X from onboard audio or even a mobile phone, I chose to also pick up a Schiit Audio Fulla 3, a $99 DAC/amp combo that has plenty of power for these headphones. The Fulla 3 delivers nice, clean output without breaking the bank. From what I’ve heard, the HD 58X doesn’t scale particularly well with higher end amps, so you probably don’t need to spend much more to get better sound out of these headphones.

Overall, I think you’d be hard pressed to find a better sounding combo for just under $300. It’s probably not an “endgame” pairing, but it is a massive upgrade over the Bose QC25s and the awful Realtek onboard audio I was coming from. The only complaint I have with the Fulla 3 is the volume knob: it feels incredibly cheap and is a bit squeaky when turning, but the potentiometer itself seems fine. Considering it was only $99, I can give it a pass. The HD 58X themselves are pretty great; the construction is mostly plastic but super light, so they are very comfortable. The stock velour pads are a bit scratchy at first but get super comfy after breaking them in, as does the clamp force, which softens up after a week of use. I found myself using these headphones way more often than my desktop speakers because I just love how they sound. If you are looking for an entry point into higher end audio, this is a great value-minded starting point.

Quick review of Parsec – In-home game streaming for the masses

I generally don’t review software, but every so often I use something that works so well and is so impressive in its implementation that I just have to talk about it. For the past 4 days I haven’t been able to sit at my main desk due to the air conditioning in my room not working. Despite having 2 laptops and full RDP access to the desktop I use for work and school, I still had one use case unaccounted for: playing PC games. Now, I’m not a competitive or hardcore gamer by any means, but being able to play Call of Duty: Warzone with friends has been a go-to social activity while we are all social distancing. I knew that RDP would not be a solution for gaming, but then I remembered watching a Linus Tech Tips video on Parsec. Although Linus and his team were using Parsec for video editing, Parsec brands itself as an in-home game streaming solution that allows users to remotely connect to their PCs with very low latency video encoding and decoding, making it ideal for gaming. I was a bit skeptical at first, thinking that delivering 1080p at 60 FPS with around 4-8 ms of latency would be impossible, but I was pleasantly surprised to find that Parsec delivers on its promise of low latency desktop streaming, and then some.

The setup was dead simple: just sign up for an account, set up Parsec on a host and a client machine, and you should be able to connect. My setup is pretty simple, involving my Dell Inspiron 3847 desktop (specs on my About page) and my Lenovo ThinkPad E750. The desktop is connected via Ethernet through a gigabit switch to a Spectrum “high performance” router. My ThinkPad connects to this router over an 802.11ac 5 GHz connection, courtesy of an Intel Dual Band Wireless-AC 8265 2×2 chipset. I generally play with a PS3 controller, which is not natively supported in Windows 10. I already had this working on my desktop using an old version of SCP DS3, but I needed to pass the input through from my ThinkPad, so I used Shibari since I was unable to find a working version of SCP DS3. Both SCP and Shibari make the PS3 DualShock controller appear as an XInput controller, specifically an Xbox 360 controller, so games with native controller support work pretty much flawlessly. Parsec passes XInput through automatically, so no additional configuration was needed for my controller.

I finally fired up Call of Duty: Warzone and wow, is this piece of software impressive. There is definitely some input lag, due to both the video encode/decode latency and Vsync being enabled in Parsec (while still disabled in the game). I opted to leave Parsec’s Vsync on since I already experience a ton of screen tearing in Warzone (it pretty much maxes out my GTX 1050 Ti) and didn’t want to exacerbate the problem while streaming it. Star Wars Battlefront II and Grand Theft Auto V performed quite a bit better since they rely less on low latency input, and overall the experience was great, except for one complaint. Parsec can pass through audio output just fine, but it currently does not support mic input. This is fine for most of the games I play, except for Warzone, and since my friends are playing on console they can’t use something like Discord for comms. The workaround proposed by Parsec is super janky: it consists of using two Discord accounts, an audio driver to loop audio output back to audio input, and subjecting your friends to incredibly annoying echoes while talking. I tried this out and it did not go well, so I went back to just pointing at stuff in game, which makes communicating a lot harder in a game where good communication is essential. Overall, I’m still really impressed by Parsec and will continue to use it while I’m away from my desk.

macOS Mojave on a 2009 HP Laptop – The Return of the World’s Worst “Hackintosh”

You may remember that I installed macOS Sierra on my 2009 HP Pavilion dv6-2000t, as covered in this blog post: https://thestackunderflowblog.wordpress.com/2017/07/07/installing-macos-sierra-on-a-2009-hp-pavilion-laptop/. Since then, I had to reinstall Windows 10 to use it as a Plex server for a while. A few months later I was able to grab an old SFF Dell desktop with a 3rd gen i5 to take over Plex duties, sending my HP into retirement yet again. At that point I figured I should just recycle the computer since I didn’t really have much use for it. That changed earlier this week, when I decided to dig it up and give it one more shot at life, with one more attempt at a working macOS install.

My Sierra install was pretty smooth except for one jarring issue: lack of proper CPU power management. I don’t think I was getting the full performance out of the first gen Core i7-720QM inside of macOS, and I never could figure out why. Actually, I am still not sure if I’m getting the full performance in Mojave, but so far it seems faster than my Sierra install, so I’ll mark that as a win in my book. Now you might ask: why not Catalina instead of Mojave? Simply put, I just couldn’t get the Catalina installer to boot, so maybe my hardware is just too old. I actually like Mojave better anyway, since it has fewer of the annoyances that were introduced in Catalina. Overall the install process was pretty similar, with the one headache this time being graphics. With my Sierra install I just used NVIDIA injection in Clover to load the native macOS NVIDIA drivers, since I have a GeForce 200 series graphics card. Now, I thought this *should* have worked in Mojave despite NVIDIA driver support ending after High Sierra, because there is one catch: only the Web Drivers stopped working after High Sierra, while the built-in native drivers for the GeForce 200 through 600 series should still load. Despite this, I was still struggling to get it working, only to realize that the 200 series microarchitecture doesn’t support Metal, which is now the default graphics layer used by the macOS window server. Yet somehow, through some wonky patched drivers, I have some graphics acceleration working in Mojave with my GeForce GT 230M, at least enough to run the laptop display at its native resolution.

Honestly, I am shocked this system works at all. It is using incredibly outdated hardware, and yet here I am, typing this blog post up on an 11-year-old computer running Apple’s second most recent Mac operating system. Here is my baseline benchmark for my primary workload, cross-compiling apps with Xamarin in Visual Studio for Mac. The test consists of a freshly created blank Xamarin Forms app targeting Android API v28 and iOS SDK 11.1. Here are the results:

HP Pavilion dv6-2000 (i7-720QM, 4 GB RAM, macOS 10.14.6): 02:44.58
2018 15″ MacBook Pro (i7-8750H, 16 GB RAM, macOS 10.15.3): 00:17.74

Blank Xamarin Forms app compile times (minutes:seconds)

The world’s worst Hackintosh takes almost 3 minutes to compile a blank Xamarin app, compared to just under 18 seconds on a 2018 15″ MacBook Pro with a Core i7. Of course, the MacBook Pro has more cores at higher clock speeds, but there are clearly microarchitectural improvements at play as well. You would expect such a divide, considering there is a 9 year age gap between these two machines.

Migrating shravanj.com to DigitalOcean

For the past 6 years I have been hosting shravanj.com on a Raspberry Pi through a residential internet line. I first started out with a first generation Model B, then upgraded to a 3rd gen Model B. There was a massive speed upgrade between the two, but it was later offset when I switched from a symmetrical fiber line with 75 Mbps up/down to a copper cable line with 400 Mbps down and 20 Mbps up. This, combined with the limited resources of the Pi, became a performance bottleneck for my site over time. Over the past few days I had been looking around at moving my site to an off-premises host when I came across DigitalOcean’s incredibly well priced $5/month droplet, which looked perfect for my needs. Today I migrated my entire site in about 45 minutes with just a few steps:

  1. Create a new droplet with a Linux VM
  2. Copy web data over SFTP
  3. Copy and apply Apache vhost configs (a rough example follows the list)
  4. Configure iptables for firewall and setup fail2ban
  5. Point DNS to the droplet’s public IP
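
As a rough illustration of step 3, here is what a minimal Apache vhost for a static site might look like. This is a sketch rather than my actual config; it assumes Apache 2.4 on an Ubuntu droplet, and the document root path is a placeholder:

    <VirtualHost *:80>
        ServerName shravanj.com
        ServerAlias www.shravanj.com

        # Placeholder path; point this at wherever the site files were copied
        DocumentRoot /var/www/shravanj.com

        ErrorLog ${APACHE_LOG_DIR}/shravanj.com-error.log
        CustomLog ${APACHE_LOG_DIR}/shravanj.com-access.log combined
    </VirtualHost>

Enable the site with a2ensite, reload Apache, and the droplet is ready to serve once DNS is pointed over.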

I am shocked at how easy it is to migrate static websites over to DigitalOcean, and the performance improvement is staggering. Prior to the migration, my homepage would take around 5 to 10 seconds to load; I am now seeing page loads in under a second. This is incredible for just $5 per month! Overall I am very satisfied with the platform and value, and will likely be using DigitalOcean for the foreseeable future.

Enabling CORS Support in .NET Core 3.0 Web API

CORS, or Cross-Origin Resource Sharing, is a web mechanism that allows cross-origin API calls (read: requests coming from a different domain) and resource sharing. Typically, browsers block cross-origin calls, but with CORS enabled, your browser will allow these requests when certain header values are returned by the server. This is especially useful when making AJAX calls to another domain’s API. When you are consuming someone else’s API, this has likely already been enabled on their end, but what if you are writing your own Web API? Fortunately, development frameworks such as .NET Core have CORS support built in via middleware. Microsoft provides some solid documentation on it here: https://docs.microsoft.com/en-us/aspnet/core/security/cors?view=aspnetcore-2.2. However, this only works up to .NET Core 2.2. If you are working with .NET Core 3.0 like I am, you will notice that the listed instructions for enabling CORS do not work. This actually has to do with the order in which the setup calls are made, and which methods they are made in, as shown in this issue on GitHub: https://github.com/aspnet/AspNetCore/issues/16672
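
To make that concrete, here is the kind of response header the server has to return before the browser will hand a cross-origin response back to your code (the origin shown is a placeholder):

    Access-Control-Allow-Origin: https://example.com

A wildcard value of * allows any origin, which is what the configuration below ends up producing.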

The correct way to enable CORS in .NET Core 3.0 is as follows. Note that this configuration:

  • Allows any origin (making the API fully accessible to any website or program that wants to call it)
  • Allows any HTTP method, whether it be GET, POST, PUT, OPTIONS, DELETE, etc.
  • Allows any headers, like Content-Type, Authorization, etc.

You can limit these by explicitly listing which methods, headers, or origins you want to allow; take a look at the Microsoft documentation for the syntax (it’ll still work in Core 3.0 as long as you follow the correct ordering below).

        public void ConfigureServices(IServiceCollection services)
        {
            // Register the controller and CORS services
            // (the ordering that actually matters is in Configure below)
            services.AddControllers();
            services.AddCors();
        }


        public void Configure(IApplicationBuilder app, IWebHostEnvironment env)
        {
            if (env.IsDevelopment())
            {
                app.UseDeveloperExceptionPage();
            }

            app.UseHttpsRedirection();

            app.UseRouting();

            // This call MUST be made between UseRouting and UseAuthorization
            // For your API to be completely accessible to any public consumer, you should allow requests from any origin
            // You can add restrictions for allowed methods and headers, but in this case we want to allow them all
            app.UseCors(
                options => options.SetIsOriginAllowed(origin => true).AllowAnyOrigin().AllowAnyMethod().AllowAnyHeader()
            );

            app.UseAuthorization();

            app.UseEndpoints(endpoints =>
            {
                endpoints.MapControllers();
            });
        }
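
If you do want to lock the API down, the same call can be restricted instead. Here is a rough sketch; the origin, methods, and headers below are placeholders for illustration, not values from my project:

            // A more restrictive variant of the UseCors call above:
            // only the listed origin, methods, and headers are allowed
            // ("https://example.com" is a placeholder origin)
            app.UseCors(
                options => options
                    .WithOrigins("https://example.com")
                    .WithMethods("GET", "POST")
                    .WithHeaders("Content-Type", "Authorization")
            );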

Xamarin Tips and Tricks: Uploading iOS App Archives using Xcode

Xcode 11 drops support for Application Loader, which was the de facto standard for uploading apps whether they were developed natively or through a cross-platform framework like Xamarin. Microsoft has since updated their documentation to show how to upload apps directly via VS for Mac: https://docs.microsoft.com/en-us/xamarin/ios/deploy-test/app-distribution/app-store-distribution/publishing-to-the-app-store?tabs=macos

While this method seems convenient, it involves some additional setup. Since VS for Mac is a third-party application that needs to access information related to your Apple ID, you’ll need to generate an app-specific password for authentication. This sounds easy, but note that it also requires 2FA to be enabled on that account. I am a strong proponent of 2FA, but this may not work out so easily for everyone, such as when your development device uses a different Apple ID than the one associated with your Apple Developer account. Fortunately, there is a workaround using Xcode’s app distribution feature. If you are logged in to your development Apple ID in Xcode (Xcode > Preferences > Accounts), you can actually upload directly through the Organizer (Window > Organizer). As long as you have already generated the archive for publishing in VS for Mac, it will show up in the Xcode Organizer and you can sign and distribute the app via the usual channels (App Store, Ad Hoc, Enterprise).


Issues with push notification device tokens on iOS 13 (including a fix for Xamarin)

While I was trying to fix some issues with push notifications on a backend system, I noticed that the device token being generated by my iPhone X running iOS 13.1 looked quite strange, as it was returning something that looked like this:

{length = 32, bytes = 0x965b251c 6cb1926d e3cb366f dfb16ddd … 5f857679 376eab7c }

When I was expecting it to look more like this:

<965b251c 6cb1926d e3cb366f dfb16ddd e6b9086a 8a3cac9e 5f857679 376eab7C>

I soon found out, thanks to the articles listed below, that there is a breaking change in the way device tokens are accessed on iOS 13:

https://nshipster.com/apns-device-tokens/

https://onesignal.com/blog/ios-13-introduces-4-breaking-changes-to-notifications/

The fixes for Swift and Objective-C are provided in those articles, but what about Xamarin? Thanks to Stack Overflow, we have a solution for that as well:

https://stackoverflow.com/questions/58027344/how-to-get-device-token-in-ios-13-with-xamarin

Note that this solution doesn’t take into account the formatting for iOS versions below 13, which is why I have written a code snippet that does.
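
In short, the idea is to read the raw bytes out of the NSData token rather than parsing its description, which behaves the same on iOS 13 and on earlier versions. Here is a minimal sketch of that approach (a reconstruction for illustration, assuming the token is captured in the AppDelegate’s RegisteredForRemoteNotifications override):

    // In the AppDelegate subclass of a Xamarin.iOS project
    public override void RegisteredForRemoteNotifications(UIApplication application, NSData deviceToken)
    {
        // Reading the raw bytes sidesteps NSData's description format,
        // which is what changed in iOS 13, so the same code works on
        // iOS 13 and on earlier versions
        byte[] bytes = deviceToken.ToArray();
        string token = BitConverter.ToString(bytes).Replace("-", string.Empty).ToLowerInvariant();

        // Hand the hex token off to your push notification backend here
        Console.WriteLine($"Device token: {token}");
    }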

There’s actually another issue causing problems on iOS 13, which has to do with the new apns-push-type header. I will go over that in another write-up, but for now I recommend reading the article from OneSignal, as it covers this as well. As always, thanks for reading.