Product Recommendation: NETGEAR WAX202 WiFi 6 Access Point

I have been a fan of Ubiquiti Access Points since setting one up for the first time almost 2 years ago, and currently have a U6 Lite in my home. Recently, I had to move the U6 to a different location on the second floor, and I immediately noticed some issues with range on the first floor. I used the UniFi app to adjust the radio gain from ‘Auto’ to ‘High’, which helped, but I still wasn’t getting great speeds. I started looking at adding a second access point for the first floor, and initially considered the U6 Mesh. However, it’s been out of stock for a while on ui.com, and $179 is a bit steep considering the U6 Lite was $99. After some Googling, I came across some WiFi 6 access points made by NETGEAR. The WAX202 seemed like it would work nicely, and at just $39.99 (I got it for even less with a coupon, directly from netgear.com), I figured I should give it a try.

Setup was pretty straightforward, and the GUI is easy to navigate but pretty basic. Since this was going to be a secondary AP, I wasn’t looking for fancy features anyway. I run my U6 Lite without a dedicated UniFi controller, so I didn’t mind the WAX202’s lack of support for a centralized management system. I ran a speed test on my iPhone 12, and was pleasantly surprised to see it outperform the U6 Lite in both download and upload throughput. The range was better than the U6 Lite too, as I was still able to get great speeds on the second floor. Even with sub-optimal placement, the WAX202 still works well. In fact, I could probably get by with the WAX202 alone, but I’ll be running both for now. Since I used it in AP mode, I cannot comment on its routing capabilities, but I can attest that the built-in 1 GbE switch works just fine.

The few negatives that I can think of are more nitpicks than anything. The status LEDs are overly bright, but they can be disabled in settings. Build quality is pretty average, if not slightly below, but for $40.00 I cannot really complain. For certain power users, the settings might be a bit too basic, and the lack of a centralized management interface might be a deal breaker for small business deployments.

If you can find one on sale, the WAX202 might be one of the best budget WiFi 6 access points you can buy for covering small to medium sized spaces.

Dell Thunderbolt Dock WD19TB dual monitor troubleshooting

I’ve been using a Dell WD19TB dock at home for over a year now, and it has become one of my favorite pieces of WFH hardware. I can connect all of my peripherals through a single connection, along with full power delivery to my laptop, a Dell Latitude with a 10th gen i7. Recently, I purchased a 32” LG 4K monitor to act as my primary display, moving my 29” Ultrawide to the side as a secondary. I knew that modern Intel integrated graphics, and the dock itself, could easily support this configuration. What I did not account for was the correct combination of ports and cables needed to get this setup working.

I first plugged in the 4K display over the included DisplayPort 1.2 cable, and the Ultrawide over HDMI 1.4. The 4K display worked just fine, but the Ultrawide was capped at 1920×1080, whereas its native resolution is 2560×1080. This was not a problem with the cable, as I had used the same cable to connect to the dock before I added the 4K display. I then plugged the HDMI cable into the laptop directly, and the native resolution was immediately available. After sifting through Dell’s documentation, I found that there is a bandwidth and resolution cap I was hitting with this combination of HDMI and DisplayPort. To get around it, I needed one source over HDMI and the other over USB-C. I connected the 4K display to the HDMI 2.0 port on the dock, and the Ultrawide over an HDMI to USB-C cable. At first I plugged it into the front USB Type-C port, since the documentation stated you can’t use the one on the back next to the HDMI port while HDMI is active. The display went unrecognized, and I got an “Unknown USB device” warning in Windows. I then used the Thunderbolt 3 port on the back of the dock (next to the power connector), and the second display immediately appeared.

Plug in your HDMI to USB-C cable here

To summarize, my config utilizes the following:

LG 1080p Ultrawide -> HDMI to USB-C cable -> Thunderbolt 3 port on dock

LG 4K -> HDMI 2.0 cable -> HDMI 2.0 port on dock

I hope this helps anyone else who has trouble getting the right ports and cables to run multiple monitors on the WD19TB.

From Raspberry Pi to 24 Core Server: The evolution of my at-home computing infrastructure

My journey in building a “homelab”, like many others, started with the purchase of a Raspberry Pi 2 Model B in 2015. It was connected to a now-aging ASUS 802.11ac router, and foolishly double-NAT’d behind an archaic Verizon FiOS router (this was back in high school, when I knew very little about networking). I managed to dig up a picture I took for the presentation that went along with the school project I bought the RPi for:

Fast forward to 2021, and here’s what my setup looks like today:

EdgeRouter X with a UniFi AP-AC PRO
Late 2014 Mac mini connected to an unmanaged TP-Link switch
Refurbished Dell Precision 7810 with dual Xeons, 128 GB of ECC DDR4, 2 TB of flash-backed storage, and 1 TB of spinning-disk storage

Compute Upgrades

The upgrade process to get to today’s end result didn’t happen overnight; over the course of the past 6 years I have gone through a 3rd generation Raspberry Pi, an old HP laptop (the same one used in my “World’s worst Hackintosh” post), and a Dell SFF desktop from a previous employer that was about to throw it out. The first RPi, and the subsequent pieces of hardware, mainly served as file and HTTP servers. I ran my main website, shravanj.com, at home on the first and third generation RPis for several years, up until last year when I migrated the site to DigitalOcean. After moving to a more Windows-based infrastructure with the HP laptop and Dell SFF, I began running Plex for in-home streaming and media management. Little did I know that my needs would continue to grow past Plex, to the point where I now run two Aerospike CE instances, MongoDB, Microsoft SQL Server, Plex, several ASP.NET Core applications in IIS, and more on a single machine. This is all possible thanks to the refurbished Precision 7810 tower. Equipped with dual Intel Xeon E5-2678s totaling 24 cores and 48 threads, 128 GB of ECC DDR4 RAM, a 480 GB Intel DC series SSD, a 1 TB Samsung 860 EVO SSD, a 500 GB SanDisk Ultra SSD, and a 1 TB WD Blue spinning hard drive, this machine still has plenty of headroom for future projects. Instead of running Windows on bare metal like I’ve done previously, I opted to run Proxmox VE as my hypervisor with Windows Server 2019 in a single VM.

Proxmox VE dashboard

As you can see from the dashboard, I’m barely pushing the machine. I have Aerospike 5 Community Edition running in two CentOS 7 VMs for full replication, and a third CentOS 7 VM running MS SQL Server and MongoDB 4.4. I do quite a bit of experimentation with databases, as well as supporting the Pseudo Markets project through this infrastructure, hence having 3 VMs dedicated just to running database engines. Complementing the Precision tower is the 2014 Mac mini, which primarily serves as an SMB file store since I do mobile development on my 2018 MacBook Pro these days.
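For the curious, spinning up a VM like the Windows Server one can be sketched with Proxmox VE’s `qm` CLI. Everything here (the VM ID, name, disk size, and memory) is illustrative, not my actual configuration:

```shell
# Sketch: create a Windows Server 2019 VM on Proxmox VE (hypothetical ID/sizes)
qm create 100 \
  --name win2019 \
  --memory 32768 \
  --cores 16 --cpu host \
  --ostype win10 \
  --scsihw virtio-scsi-pci \
  --scsi0 local-lvm:200 \
  --net0 virtio,bridge=vmbr0 \
  --ide2 local:iso/win-server-2019.iso,media=cdrom

# Boot it, then finish the install from the web UI's noVNC console
qm start 100
```

Using VirtIO for the disk controller and NIC is worth the small hassle of loading the drivers during Windows setup, since paravirtualized I/O is noticeably faster than the emulated alternatives.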

Network Upgrades

After deploying Ubiquiti UniFi equipment for a previous employer, I realized how bad most consumer networking gear is. The ASUS router was pretty solid 5 years ago, but pales in comparison to the UAP-AC-PRO and EdgeRouter X combo I am currently running. Admittedly, I would now spring for the newer UniFi 6 APs, but these were not available when I set up the AC PRO. As for the EdgeRouter X, I thought it delivered the best price-to-performance ratio for my network setup. Having worked with a UniFi Security Gateway and a UDM Pro, I didn’t need that level of orchestration since I wasn’t going to use any UniFi managed switches or any additional UniFi controller features. The EdgeRouter X is able to max out my 500/500 Mbps fiber connection and provide rock solid gigabit performance within my intranet. It still offers powerful routing and management features, and does not require a controller since it runs EdgeOS. I run the AC Pro in standalone mode, using the UniFi app on my iPhone as its controller for initial setup. But after all that, it has been set-and-forget, and that’s what I really like about Ubiquiti’s gear. There are some other great products out there like MikroTik’s RouterBOARDs, but the ease of use and familiarity I have with Ubiquiti products made this combo a no-brainer for me. Aside from that, there’s nothing too fancy going on with the inner workings of the network. It’s pretty standard, with the ERX managing DHCP with a few static IP allocations, hairpin NAT, HTTP and HTTPS port forwarding, and such.
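As a rough sketch of what that looks like from the EdgeOS CLI — the interface names, MAC, and addresses below are hypothetical placeholders, not my actual network:

```shell
configure

# Static DHCP lease for the server (example MAC/IP)
set service dhcp-server shared-network-name LAN subnet 192.168.1.0/24 static-mapping server ip-address 192.168.1.10
set service dhcp-server shared-network-name LAN subnet 192.168.1.0/24 static-mapping server mac-address aa:bb:cc:dd:ee:ff

# HTTPS port forward with hairpin NAT, so LAN clients can hit the public hostname too
set port-forward wan-interface eth0
set port-forward lan-interface switch0
set port-forward hairpin-nat enable
set port-forward rule 1 description HTTPS
set port-forward rule 1 protocol tcp
set port-forward rule 1 original-port 443
set port-forward rule 1 forward-to address 192.168.1.10
set port-forward rule 1 forward-to port 443

commit ; save ; exit
```

The nice part of the `port-forward` wizard config is that hairpin NAT is a single toggle, rather than the manual destination + source NAT rule pair you’d otherwise have to write.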

Overall I am very pleased with this setup, and it’s fun to look back and see how my compute infrastructure has grown from a single RPi to a full-blown virtualized server environment. All of this was done with budget in mind as well, such as buying the Dell Precision as a refurb. You don’t need a corporate datacenter budget to build your own mini datacenter at home.

PlayStation 5 Review: Next-gen gaming is finally here

The Sony PlayStation 5 might be the most anticipated product release of the year, and now seems to be sold out almost everywhere after pre-orders shipped and remaining stock was depleted. I was very lucky to secure a pre-order back in September, and am happy to report that after 3 days with the console I can confidently say this $500 machine is an absolute marvel of engineering, and the most forward-thinking gaming console I have ever owned. The PS5, along with the competing Series X and Series S, brings several long-awaited features to the table: native or near-native 4K rendering, support for high refresh rate displays (up to 120 Hz), hardware-accelerated ray tracing, and ultra-fast solid state storage. But to one-up the Series X, Sony had to think outside the box and redesign the controller, dubbed the DualSense, to create a significantly more immersive experience. This review will detail my experiences with the PlayStation 5 hardware, system software, controller, and a few games.

In short, the PlayStation 5 is built using relatively new technology from AMD. The CPU is based on the Zen 2 architecture, with 8 cores and 16 threads at a 3.5 GHz max frequency. Its closest relative seems to be the Ryzen 7 3700X, also built on TSMC’s 7 nm process. The Ryzen series of processors finally brought AMD back to a competitive state with Intel in 2017, and is a quantum leap compared to the Jaguar cores found in the PlayStation 4 and 4 Pro. The GPU is brand new, featuring an RDNA 2 based architecture with 36 CUs clocked at up to 2.23 GHz, producing a maximum of 10.3 TFLOPS of compute power. It also features hardware-accelerated ray tracing and HDMI 2.1 compliant output, which is good for 4K HDR video at up to 120 FPS with 10 bit color, plus support for 8K at 60 FPS, as long as your TV supports it. Shared between the CPU and GPU is 16 GB of GDDR6 RAM with 448 GB/s of bandwidth. While these are already impressive specs, the PS5 also features a custom-built 825 GB NVMe SSD with 5.5 GB/s of throughput in typical workloads, and up to 8-9 GB/s with data compressed using RAD Game Tools’ Kraken codec, thanks to hardware-accelerated decompression. This is likely the fastest non-volatile storage to ever ship in a consumer electronics product, and definitely the fastest to ship in a gaming console. Additional improvements include the use of liquid metal to cool the CPU (again, a first in a consumer electronics product), a massive heat spreader and fan to keep the system cool and quiet, support for future M.2 NVMe storage expansion, USB 3.1 Gen 2 ports, and a UHD Blu-ray disc drive. All of these components, put together, create an absolute marvel of a machine that will impress almost all gamers, be they casual or hardcore.
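That headline 10.3 TFLOPS figure falls straight out of the GPU’s shape, and is easy to sanity-check yourself with a quick back-of-the-envelope calculation (using the standard 2 FLOPs per shader per clock for a fused multiply-add):

```python
# PS5 GPU compute: CUs x shaders per CU x FLOPs per clock x boost clock
cus = 36                 # RDNA 2 compute units
shaders_per_cu = 64      # stream processors per CU
flops_per_clock = 2      # one fused multiply-add counts as 2 FLOPs
boost_clock_hz = 2.23e9  # up to 2.23 GHz (variable frequency)

tflops = cus * shaders_per_cu * flops_per_clock * boost_clock_hz / 1e12
print(f"{tflops:.2f} TFLOPS")  # → 10.28 TFLOPS
```

Which matches the quoted ~10.3 TFLOPS at the peak boost clock; since the frequency is variable, sustained throughput will be a bit lower.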

Like I mentioned earlier, the DualSense controller is one way Sony set the PS5 apart from the Series X/S. When I first read about the controller I thought it was just a fancier version of the DualShock 4, but it is WAY more than that. The DualSense uses some very precise and sophisticated haptic feedback technology to deliver a truly next-gen feel when properly utilized. Astro’s Playroom is the ultimate showcase for the controller, as you can feel a sensation of the surface you are walking on through the controller. You have to feel it to believe it, and it is incredible the first time. Along with the haptics, the controller also features adaptive triggers for R2 and L2 that can have varying levels of resistance, again based on how the game developer decided to utilize them. In Call of Duty: Black Ops Cold War, both triggers have resistance applied, and R2 has a variable amount of resistance based on the gun that is equipped. The overall build quality of the controller is much improved, and the audio output directly from the controller seems to be improved quite a bit as well. Sony has touted its Tempest 3D AudioTech as a headphone-exclusive feature at launch, and the controller seems to do a pretty good job of passing that audio along. I think the DualSense is an excellent controller, and I hope to see its unique features utilized in future games.

Finally, we come to the actual gaming experience. For a point of reference, the PlayStation 2 is the most recent Sony system I owned prior to getting the PS5. There was a long period where I didn’t really play games at all, but thanks to COVID-19 I picked up PC gaming earlier this year, starting with Call of Duty: Warzone. Since then, I have purchased several other PC games and have had a pretty decent experience on my system, but its GTX 1050 Ti was starting to show its age. I wanted something that could deliver higher framerates at a much higher resolution, so I figured the PS5 was the right choice. I currently have the system connected to a Sony Bravia X900H, which is a fantastic TV irrespective of its gaming capabilities. So far, I have spent most of my time playing Black Ops Cold War at a buttery smooth 120 FPS thanks to performance mode on the PS5. In performance mode, the resolution isn’t native 4K, but at the distance I am viewing from I can’t really tell. What I can tell is the massive jump from 60 FPS to 120 FPS, which is perfect for fast-paced first person shooters like COD. Gameplay is extremely smooth and consistent, with almost no perceptible framerate drops or stuttering. PS4 titles also look great, and I am excited to see games like Modern Warfare receiving high-res texture pack updates to take advantage of the extra pixels the next-gen consoles can push out.

With all this said, there are still some issues that seem related to the PS5 system software. I had Black Ops Cold War suddenly exit twice, one time causing the PS5 to reboot and rebuild its system database (I think this is just a file system repair, since the OS didn’t shut down cleanly). Thankfully the rebuild was super quick, and everything was working when I got back to the home screen. I’d honestly be more surprised to see the system work flawlessly out of the box; this is day 1 hardware and software, after all.
I should also note that enabling 120 Hz support wasn’t an easy process. On the X900H, there are only two HDMI 2.1 compliant ports, HDMI 3 and HDMI 4. In the TV settings, you need to manually set the HDMI signal format on port 3 or 4 to ‘Enhanced’ for HDR and 120 Hz support. After this, you’ll need to globally set the PS5’s game presets to ‘Performance’ in order to get 120 FPS in Black Ops Cold War. It is such a clunky process to set these things up, and I’m hoping future software updates to both TVs and PS5s make it easier.

In conclusion, the PlayStation 5 brings next-gen features at a relatively consumer-friendly price. You get very capable hardware, games that will only continue to improve as developers spend more time with the system, and a nice, clean UI. This generation also brings high framerate support to the table, a long-awaited feature for consoles, and I’m excited to see more games take advantage of it. Overall, I’m very pleased with the performance of this console. Once availability improves and more people are able to get this system in their living rooms, I think Sony will again take the throne for this generation of consoles. If you can get your hands on one at MSRP, I highly recommend the PlayStation 5.

iPhone 12 Review: Evolutionary

I’ve spent a little over a week with my iPhone 12, and can sum up the experience with just one word (or phrase): evolutionary, rather than revolutionary. Coming from an iPhone X, I was already accustomed to the OLED screen, Face ID, great camera technology, and super snappy experience in iOS. In general, I don’t think it’s a massive upgrade, but I still think it’s worth it. It feels strange to spend less money on a newer model iPhone, given that starting prices have continued to rise over the past few years. I opted for a blue, non-Pro 128 GB configuration. The 12 “Pro” brings extra cameras and a fancier chassis to the lineup, but has the same display, processor, and 5G capabilities as the regular old 12 and 12 mini. Speaking of the mini, I was quite happy to see Apple embrace smaller phones. Personally, I like the 6.1″ form factor, a small size bump over the X’s 5.8″ display, but I totally get the appeal of a mini version of an otherwise large phone.

Design

The design of the iPhone 12 is more of a throwback to the iPhone 4/5 days than anything else, with the boxy edges making a return. I welcome this change, since I didn’t think the 6/7/8 were good looking phones (the X did look nice, though). I’ve always purchased darker colored iPhones, typically space grey and black, but decided to go for the dark blue this time. The color doesn’t look that great in photos, but in person it looks quite nice. This hardly matters though, since most people, myself included, will put a case on the phone that hides the color. The camera system still looks a bit silly, but overall I think the 12 is a good looking phone. It would be nice if they made the non-Pro 12 available in Pacific Blue, though.

Performance

I don’t do mobile development work anymore, so I can’t make any developer-centric comments on performance, but in general usage it does feel faster than the iPhone X in almost every regard. Unlocking the phone using Face ID, launching apps, and multitasking are a breeze. Battery life is improved too, thanks to a more efficient SoC built on TSMC’s 5 nm manufacturing process. But Apple didn’t show off the A14 Bionic’s heroic performance like they usually do during the keynote, focusing instead on 5G capabilities. 5G, despite what Verizon says, is still not ready for prime time in my opinion. Coverage is OK, but in most places you won’t really see performance that surpasses high performing LTE. I’m currently on T-Mobile’s 5G network in North Texas, and based on some preliminary speed tests it doesn’t seem to offer any improvement over LTE. This is likely because I don’t have access to “mid-band” 5G, which is supposed to offer good coverage and higher speeds than LTE. Verizon is currently leading the way in millimeter wave or “Ultra Wideband” 5G, which offers near-gigabit speeds but at the cost of range; you’ll see many other reviewers were able to test this out with direct line of sight to a mmWave-capable cell tower. 5G definitely is the future, but the future isn’t here just yet. The launch of the iPhone 12 will serve as a catalyst for carriers to continue to grow their 5G networks, since there will be greater demand for the faster speeds and capacity. All of this performance does come with a drawback: increased power consumption. Since I’m working from home, I can’t really tell how big this impact is, since I’m always connected to Wi-Fi. The Smart Data Mode found on the new iPhones claims to mitigate this by only using 5G when necessary, and falling back to LTE to save power. I’ll need to spend more time with the phone outside to test this, but battery life seems fine otherwise.

Final thoughts

I think the iPhone 12 is a great phone, pretty much like every iPhone I have owned. I’d have to admit I am biased towards Apple hardware because I’ve just had a great experience with every Apple product I’ve owned. While not a revolutionary product, I still think the iPhone 12 is important for the future of the iPhone. It has finally brought an OLED screen to the “base” phone, and closer feature parity to the Pro variants. This means you’ll be able to get a great phone without going for the top of the line model. There is nothing that blows me away about this product, and there is nothing wrong with that. It’s a well engineered device, with great software and (mostly) future proof hardware. What else is there to ask for in a phone?

LG UltraWide 29WK50S Review: The Dual Monitor Replacement

Ultrawide computer monitors are nothing new; in fact, they have been around since 2014 or so, but have only risen in popularity more recently. These monitors offer a 21:9 aspect ratio, making them wider but not taller than typical 16:9 displays. This allows for more horizontal screen real estate, which makes them better suited for workloads that require many windows placed side by side. Prior to purchasing the 29WK50S I was using 23″ and 21″ LG monitors, both 1920×1080 @ 60 Hz, which were adequate for me. As I’ve continued spending more time on my computer while working from home, I started looking at upgrading the displays along with the mount they were on, since it was not height adjustable and I couldn’t get the displays to a comfortable eye level. I knew that UltraWide monitors existed, but they had always been quite expensive, until now. The 29WK50S offers a 2560×1080 IPS panel at up to 75 Hz with FreeSync and 99% sRGB colorspace coverage for just under $200.

After spending a week with the 29WK50S, I can confidently say this is a fantastic bang-for-the-buck UltraWide that can easily replace dual 23″ screens. Brightness and color reproduction are superb, the bezels are pretty thin, and it has two HDMI inputs for easy dual computer connectivity. It’s pretty great for programming, since I can have VS snapped to half of the screen and a browser with Stack Overflow or documentation open in the other half. If I need to read or edit a really long line of code, I can snap it to take up the entire horizontal space of the display and view the line with hardly any scrolling. The only drawback I’ve noticed so far is watching videos from sources like YouTube, since most were recorded in a 16:9 aspect ratio. This means black bars on the sides of the video, but thanks to the LED backlighting you don’t notice them too much. For games that do support 21:9, such as GTA V and Call of Duty: Modern Warfare, the experience is very immersive. Although the display supports a 75 Hz refresh rate with FreeSync, I wasn’t really able to utilize it due to FPS constraints caused by the GTX 1050 Ti in my system. Overall, the 29WK50S is a great display and a great entry point to 21:9 displays.
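The “dual monitor replacement” claim checks out on paper, too. A quick bit of arithmetic comparing the 29WK50S against a standard 1080p panel (numbers are just the published resolutions):

```python
# Compare a 2560x1080 ultrawide against a standard 1920x1080 16:9 panel
uw_w, uw_h = 2560, 1080
fhd_w, fhd_h = 1920, 1080

aspect = uw_w / uw_h * 9                       # normalize to an x:9 ratio
extra_width = (uw_w / fhd_w - 1) * 100         # extra horizontal pixels, in percent
extra_pixels = (uw_w * uw_h) / (fhd_w * fhd_h) # total pixel count ratio

print(f"aspect ratio ~{aspect:.1f}:9")               # → ~21.3:9
print(f"{extra_width:.0f}% more horizontal pixels")  # → 33% more horizontal pixels
print(f"{extra_pixels:.2f}x the pixels of 1080p")    # → 1.33x the pixels of 1080p
```

So side by side it’s a third more width than a single 1080p screen, which is why two snapped windows feel comfortable rather than cramped.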

The App Store vs Developers

If you know me, you probably know that I am a pretty big fan of most Apple products. I have personally owned 3 Macs and 3 iPhones over the course of 7 years, and have a pretty big dependence on the iOS ecosystem, as I do quite a bit of mobile development work professionally. Without iOS and its development platform, I wouldn’t have been able to kick-start my career like I did when I developed my first commercially released iOS app 4 years ago. With all that said, I think Apple has developed quite a track record of screwing over developers, the people who make their platforms so powerful.

I’m writing this in response to the recent battle between Apple and Basecamp, the group behind the Hey email app. While I don’t use Hey, I completely understand the outrage behind Apple’s move to force Hey to offer their subscription as an In-App Purchase, while letting other apps like Netflix and Amazon Prime Video get a free pass. At the core of it, Apple is taking a stance against smaller developers by forcing their hand into handing over a 30% cut of IAP sales. This is wrong. I would understand such a policy if it were enforced for every app that offers a subscription, but singling out developers is completely unfair. As someone who has dealt with the App Store team firsthand, they mostly have good intentions about protecting the privacy of users and keeping the store well regulated, but when financial motives are involved it can quickly get ugly. Apple has developed a tendency to nickel-and-dime customers, and this trend is now making its way to their development platforms, which is very disappointing.

At this point I don’t think it’s about the 30% commission; it’s Apple’s unfair treatment of Basecamp while letting the bigger players get a free pass. The fact that Apple doubled down on their demands and suggested removing Hey from the App Store if they didn’t comply just makes this even worse. Apple is looking more like Oracle with these types of tactics, where mafia mentality meets technology and platform management. The growing outrage from the developer community towards Apple is totally justified. Even though I have never launched a paid app or an IAP, I can sympathize with developers who have. From delivering lackluster releases of Xcode in recent history to ripping off smaller development teams, Apple is headed in a pretty dark direction from a developer’s standpoint. While we can take a stance against Apple, their market share makes it hard to actually move away from their development ecosystem. All I can ask now is that Apple reverse their decision on Hey, and on any other apps that have been subjected to this sort of extortion, and allow developers to continue to offer the choice of purchasing subscriptions outside of the App Store. If Apple continues with this sort of behavior, who knows what they will do next to screw over the people who make their products worth using.

Sennheiser HD58X Jubilee Review

While Sennheiser might not be a household name, they are well renowned within the enthusiast and professional space for making high performance, high quality audio equipment. Their most popular products include over-ear headphones and studio microphones. While they do also sell some more consumer focused gear, their best products are targeted towards the “audiophile” crowd and consist of the headphones in the HD series. The archetypal HD 580 solidified Sennheiser’s name in delivering (mostly) neutral yet highly enjoyable sound from headphones with generally large dynamic drivers. This series evolved into the HD 600, 650, and 660S. These headphones have generally been priced in the $399+ range, putting them out of reach for most of the consumer market. That was until the group buy website Massdrop.com (now Drop.com) partnered with Sennheiser to create the HD 58X Jubilee, a 150 ohm open-back headphone with a sound signature very close to the 600 series, thanks to a very similar driver construction.

I’ve spent about a week with the HD 58X, and they are by far the best set of headphones I have ever listened to. I also use a pair of Bose QC25s that are great for ANC but not the best for detail retrieval, which the HD 58X seems to excel at. They deliver tight and controlled bass, a smooth midrange, and clear treble. This combination allows vocals to really shine through while still retaining enough detail in instrumentals. Now, I am by no means an audiophile, but I can definitely hear more resolution in music, from both high bitrate FLAC and 320 kbps Ogg Vorbis streamed from Spotify on the Very High preset. Although you could drive the HD 58X from onboard audio or even a mobile phone, I chose to also pick up a Schiit Audio Fulla 3, a $99 DAC/amp combo that has plenty of power for these headphones. The Fulla 3 delivers nice, clean output without breaking the bank. From what I’ve heard, the HD 58X doesn’t scale particularly well with higher-end amps, so you probably don’t need to spend a ton more to get better sound out of these headphones.

Overall, I think you’d be hard pressed to find a better sounding combo for just under $300. It’s probably not an “endgame” pairing, but it is a massive upgrade over the Bose QC25s and the awful Realtek onboard audio I was coming from. The only complaint I have with the Fulla 3 is the volume knob – it feels incredibly cheap and is a bit squeaky when turning, but the potentiometer itself seems fine. Considering it was only $99, I can give it a pass. The HD 58X themselves are pretty great; the construction is mostly plastic, but they are super light and very comfortable. The stock velour pads are a bit scratchy at first but get super comfy after breaking them in, as does the clamp force, which softens up after a week of use. I found myself using these headphones way more often than my desktop speakers because I just love how they sound. If you are looking for an entry point into higher-end audio, this is a great place to start while still being value minded.

Quick review of Parsec – In-home game streaming for the masses

I generally don’t review software, but every so often I will use something that works so well, and is so impressive in its implementation, that I just have to talk about it. For the past 4 days I haven’t been able to sit at my main desk due to the air conditioning in my room not working. Despite having 2 laptops and full RDP access to the desktop that I use for work and school, I still had one use case unaccounted for: playing PC games. Now, I’m not a competitive or hardcore gamer by any means, but being able to play Call of Duty: Warzone with friends has been a go-to social activity while we are all social distancing. I knew that RDP would not be a solution for gaming, but then I remembered watching a Linus Tech Tips video on Parsec. Although Linus and his team were using Parsec for video editing, Parsec brands itself as an in-home game streaming solution that allows users to remotely connect to their PCs with super low latency video encoding and decoding, making it ideal for gaming. I was a bit skeptical at first, thinking that delivering 1080p at 60 FPS with around 4-8 ms of latency would be impossible, but I was pleasantly surprised to see that Parsec delivers on its promise of low latency desktop streaming, and then some.

The setup was dead simple: just sign up for an account, set up Parsec on a host and a client machine, and you should be able to connect. My setup is pretty simple, involving my Dell Inspiron 3847 desktop (specs on my About page) and my Lenovo ThinkPad E750. The desktop is connected via Ethernet to a gigabit switch, which connects to a Spectrum “high performance” router. My ThinkPad is connected to this router through an 802.11ac 5 GHz connection, courtesy of an Intel Dual Band 2×2 8265 wireless chipset. I generally play with a PS3 controller, which is not supported natively in Windows 10. I already have this working on my desktop using an old version of SCP DS3, but I needed to pass through the input on my ThinkPad, so I used Shibari since I was unable to find a working version of SCP DS3. Both SCP and Shibari make the PS3 DualShock controller appear as an XInput controller, specifically an Xbox 360 controller, so games with native controller support work pretty much flawlessly. Parsec has XInput pass-through support out of the box, so no additional config was needed for my controller.

I finally fired up Call of Duty Warzone, and wow is this piece of software impressive. There is definitely some input lag, due both to the video encode/decode latency and to Vsync being enabled in Parsec (while still disabled in the game). I opted to leave Parsec’s Vsync on since I already experience a ton of screen tearing in Warzone (it pretty much maxes out my GTX 1050 Ti) and didn’t want to exacerbate the problem while streaming it. Star Wars Battlefront II and Grand Theft Auto V performed quite a bit better since they are less sensitive to input latency, and overall the experience was great, with one complaint. Parsec passes through audio output just fine, but currently does not support mic input. This is fine for most of the games I play, but not for Warzone: since my friends are playing on console, they can’t use something like Discord for comms. The workaround proposed by Parsec is super janky, consisting of two Discord accounts, an audio driver to loop audio output back to audio input, and subjecting your friends to incredibly annoying echoes while you talk. I tried it out and it did not go well, so I went back to just pointing at stuff in game, which makes communicating a lot harder in a game where good communication is essential. Overall, I’m still really impressed by Parsec and will continue to use it while I’m away from my desk.

macOS Mojave on a 2009 HP Laptop – The Return of the World’s Worst “Hackintosh”

You may remember that I installed macOS Sierra on my 2009 HP Pavilion dv6-2000t, as covered in this blog post: https://thestackunderflowblog.wordpress.com/2017/07/07/installing-macos-sierra-on-a-2009-hp-pavilion-laptop/. Since then, I had to reinstall Windows 10 to use it as a Plex server for a while. A few months later I was able to grab an old SFF Dell desktop with a 3rd gen i5 to take over Plex duties, sending the HP into retirement yet again. At that point I figured I should just recycle the computer since I didn’t really have much use for it. Then, earlier this week, I decided to dig it up and give it one more shot at life, with one more attempt at a working macOS install.

My Sierra install was pretty smooth except for one jarring issue: the lack of proper CPU power management. I don’t think I was getting the full performance out of the first-gen Core i7-720QM inside macOS, and I never could figure out why. I’m actually still not sure whether I’m getting full performance in Mojave, but so far it seems faster than my Sierra install, so I’ll mark that as a win in my book. Now you might ask: why not Catalina instead of Mojave? Simply put, I just couldn’t get the Catalina installer to boot, so maybe my hardware is just too old. I actually like Mojave better anyway, since it has fewer of the annoyances that were introduced in Catalina. Overall the install process was pretty similar, with the one headache this time being graphics. For my Sierra install I just used the NvidiaInjector option in Clover to inject the native macOS NVIDIA drivers, since I have a GeForce 2xx graphics card. I thought this *should* have worked in Mojave despite NVIDIA driver support ending with High Sierra, but there is one catch: only the Web Drivers stopped working after High Sierra; the built-in native drivers for the GeForce 2xx-6xx series should still work. Despite this, I was still struggling to get it working, only to realize that the microarchitecture of the 200 series doesn’t support Metal, which is now the default graphics layer used by the macOS window server. Yet somehow, through some wonky patched drivers, I have some graphics acceleration working in Mojave with my GeForce GT 230M, at least enough to run the laptop display at its native resolution.
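For anyone curious what the NVIDIA injection setup looks like, here is a rough sketch of the relevant Graphics section of a Clover config.plist (reconstructed from memory, so treat the exact keys as an approximation and check Clover’s documentation for your version):

```xml
<!-- Fragment of Clover's config.plist: tell Clover to inject
     properties for the NVIDIA GPU so the native macOS drivers
     can pick it up. Other Graphics keys omitted for brevity. -->
<key>Graphics</key>
<dict>
    <key>Inject</key>
    <dict>
        <key>ATI</key>
        <false/>
        <key>Intel</key>
        <false/>
        <key>NVidia</key>
        <true/>
    </dict>
</dict>
```

This injection alone is what got the GT 230M recognized under Sierra; under Mojave the patched drivers were still required on top of it because of the missing Metal support.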

Honestly, I am shocked this system works at all. It is using incredibly outdated hardware, and yet here I am, typing this blog post on an 11-year-old computer running Apple’s second most recent Mac operating system. Here is my baseline benchmark for my primary workload, cross-compiling apps with Xamarin in Visual Studio for Mac. The test consists of a freshly created blank Xamarin Forms app targeting Android API v28 and iOS SDK 11.1. Here are the results:

HP Pavilion dv6-2000 (i7-720QM, 4 GB RAM, macOS 10.14.6): 02:44.58
2018 15″ MacBook Pro (i7-8750H, 16 GB RAM, macOS 10.15.3): 00:17.74

Blank Xamarin Forms app compile times

The world’s worst Hackintosh takes almost 3 minutes to compile a blank Xamarin app, compared to just under 18 seconds on a 2018 15″ MacBook Pro with a Core i7, roughly a 9x difference. Of course, the MacBook Pro has more cores at higher clock speeds, but there are clearly microarchitectural improvements at play as well. You would expect such a divide, considering the 9-year age gap between these two machines.
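For a rough sense of scale, here is the back-of-the-envelope math on those two compile times:

```python
# Compile times from the table above, converted to seconds.
hackintosh = 2 * 60 + 44.58   # HP Pavilion dv6-2000: 02:44.58
macbook_pro = 17.74           # 2018 15" MacBook Pro: 00:17.74

# Ratio of the two compile times.
speedup = hackintosh / macbook_pro
print(f"The MacBook Pro is roughly {speedup:.1f}x faster")  # ~9.3x
```

Nine years of CPU progress (plus four times the RAM) buying a better-than-9x compile speedup feels about right for this workload.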