State of Software: 2022
State of Software in 2022, from a developer’s point of view.
The state of software in 2022 is very much in flux.
Don’t get me wrong, hardware is changing rapidly too, but hardware has been abstracted to the point where entirely new CPU, disk, and memory technologies usually just improve the software experience over their predecessors, with new speed and almost unlimited capacity. And other devices (GPUs, headphones, cameras, GPS receivers, etc.) are easily accessed by any software once the correct drivers are loaded; few things are tied to specific hardware anymore.
But software has undergone many transformations, some brilliant and some unsettling.
Most applications are now fully or at least partly on the web, and even some locally executing applications use web technologies (HTML5, JavaScript) for their user interfaces, scripting, and file formats. These are good moves toward platform-agnostic strategies: users can move between computers of different brands and technologies, phones, tablets, e-readers, and assistive devices for special-needs individuals such as screen readers and zoomed displays, and can work remotely during pandemics, travel, and so on.
This is particularly common in enterprise software, where adopting web technologies has allowed cloud providers to offer scalable solutions and turn incredible profits along the way. Workday, Adobe, Google, and many others have deprecated or eliminated local options and pushed everyone onto subscription-based cloud services.
The holdouts (in commercial software) are largely specialized packages such as scientific and engineering applications, still developed for the bulky PC model where you benefit from a beast of a computer. These companies have an enormous investment in software of this design, a captive market, and little reason to change.
Some vendors (like MathWorks with MATLAB, and Microsoft) offer weaker versions in the cloud, but currently require desktop or hybrid versions for the premium experience. Microsoft intends to change that for Windows 12, but today everything from Office and Teams to their business intelligence software usually works better, and with more data, when running natively on the desktop, with a weaker web-only version for casual use.
There is also the guaranteed income from subscriptions that must be renewed yearly; gone are the days when you bought once and used the software for years, though one seriously wonders which of this month’s word-processor or spreadsheet improvements you actually need.
Many issues that frustrate developers are actually liberating for users. It is challenging as a developer not to know the width and height of the screen when you are creating software. On a PC, you can usually assume 800x600 pixels minimum, or far more on a modern 4K or 8K screen, but on a phone those numbers are vastly different and change when the user rotates the device. So we use responsive design, a strategy in which the screen dimensions and the application layout are not fully decided until “run-time”: we supply only a strategy for the layout, and the exact arrangement is not known until the user actually views the screen. Turn your phone sideways and watch the software react to the updated dimensions.
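As a minimal sketch of run-time layout selection in browser TypeScript (the “narrow” and “wide” class names here are hypothetical, and the real layout rules would live in CSS):

    // Decide the layout at run-time, and re-decide whenever the viewport
    // changes (for example, when the user rotates a phone).
    const narrowViewport = window.matchMedia("(max-width: 600px)");

    function applyLayout(isNarrow: boolean): void {
      // Swap a class on <body>; CSS rules keyed to these classes do the rest.
      document.body.classList.toggle("narrow", isNarrow);
      document.body.classList.toggle("wide", !isNarrow);
    }

    applyLayout(narrowViewport.matches); // initial layout
    narrowViewport.addEventListener("change", (e) => applyLayout(e.matches));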
The current level of standardization could not have happened years ago. Browsers used to be vastly incompatible with each other and with their own previous versions, and every possible previous version was in use on someone’s desktop. It was challenging to build even relatively simple web applications that worked across platforms, let alone across brands of browser. Now that browsers force updates to new releases and Internet Explorer is dying off, things are much better. In fact, most browsers now handle even challenging tasks, like granting access to cameras and video streams for video conferencing apps, with minimal coding.
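For example, the standard getUserMedia API reduces camera access to a few lines; the <video id="preview"> element here is an assumed part of the page:

    // Ask the browser for camera and microphone streams and show a live
    // preview. The browser itself handles permissions and device selection.
    async function startPreview(): Promise<void> {
      const stream = await navigator.mediaDevices.getUserMedia({
        video: true,
        audio: true,
      });
      const video = document.getElementById("preview") as HTMLVideoElement;
      video.srcObject = stream; // attach the live stream to the element
      await video.play();
    }

    startPreview().catch((err) => console.error("Camera unavailable:", err));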
Some web APIs, like voice recognition, are not yet implemented quite so universally, but each new browser version seems to improve compatibility with such standards.
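Until then, feature detection is the usual workaround. A sketch, assuming the Web Speech API (some browsers only ship a vendor-prefixed version, hence the casts):

    // Look for the standard constructor first, then the prefixed one.
    const SpeechRec =
      (window as any).SpeechRecognition ?? (window as any).webkitSpeechRecognition;

    if (SpeechRec) {
      const recognizer = new SpeechRec();
      recognizer.lang = "en-US";
      recognizer.onresult = (event: any) =>
        console.log("Heard:", event.results[0][0].transcript);
      recognizer.start();
    } else {
      console.warn("Voice recognition is not supported in this browser.");
    }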
JavaScript executes faster now than ever before, and Web Workers can put the extra cores in your CPU to use. WebAssembly has also brought raw speed to the web browser: while it is more challenging to program, it offers near-native application speed. The first apps to use it are largely games, but there are business uses too, like encryption and 3D rendering for visualization and animation.
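A minimal Web Worker sketch, where the heavy work runs on another core so the user interface never freezes (the hash-worker.js file name and expensiveComputation function are hypothetical):

    // --- main thread ---
    const worker = new Worker("hash-worker.js");
    worker.postMessage("some large input"); // hand the work off
    worker.onmessage = (e: MessageEvent<string>) =>
      console.log("Result from worker:", e.data);

    // --- hash-worker.js, running in parallel on another core ---
    // self.onmessage = (e) => {
    //   const result = expensiveComputation(e.data);
    //   self.postMessage(result);
    // };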
A huge boon has been the advent of the single-page application (SPA) design strategy, where you do not load a new page at every turn. Instead, the page’s JavaScript updates the screen immediately, as a local application would, and relies on the server only for data, not to redraw the whole screen. This strategy is what gives Gmail, Facebook, and others their decent performance compared to the click-and-wait pages of older designs, and it is increasingly used in ordinary web applications.
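The core of the pattern is small: fetch data, then redraw only the affected region of the page. A sketch, with a hypothetical /api/messages endpoint and element ids:

    // Refresh one region of the page in place; no full page reload.
    async function refreshInbox(): Promise<void> {
      const res = await fetch("/api/messages"); // data only, not a whole page
      const messages: { subject: string }[] = await res.json();
      const list = document.getElementById("inbox")!;
      list.innerHTML = ""; // redraw just this region
      for (const m of messages) {
        const li = document.createElement("li");
        li.textContent = m.subject;
        list.appendChild(li);
      }
    }

    document.getElementById("refresh")!.addEventListener("click", () => {
      refreshInbox().catch(console.error);
    });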
What especially benefits the business community is that web applications can be kept current: change the web page, and everybody is now using the same version of the software. With classic workstation applications, that level of synchronization is not possible.
We have also standardized on a dozen or so reasonable technologies behind the scenes, from databases to OAuth, REST, and similar protocols, which let one build solutions out of Lego-like components.
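To illustrate the Lego-like quality: one small, generic helper can talk to any REST service once an OAuth flow has produced a token. The endpoint, token source, and Employee type below are hypothetical:

    // Generic REST call with an OAuth bearer token.
    async function getJson<T>(url: string, accessToken: string): Promise<T> {
      const res = await fetch(url, {
        headers: { Authorization: `Bearer ${accessToken}` },
      });
      if (!res.ok) throw new Error(`HTTP ${res.status} from ${url}`);
      return res.json() as Promise<T>;
    }

    // The same helper works against any service that speaks REST.
    interface Employee { id: number; name: string }
    getJson<Employee[]>("/api/employees", "token-from-oauth-flow")
      .then((staff) => console.log(staff.length, "employees"));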
That, in my opinion, is the general good news about software. Now the bad.
The flux means there is a lot of uncertainty: most developers no longer understand the whole “stack”, meaning all of the software from top to bottom, and many things change underfoot.
In the ancient 1980s, software was updated every few years, operating systems every few years, and antivirus definitions were sent out on floppies every few months. Now software and OS updates arrive almost daily, and antivirus definitions several times a day for some brands. The Internet means you are constantly at risk of new attacks, and the underlying code developers depend on changes underfoot and cannot always be tested. Much stability is gone. You can no longer say you have tested on a similar configuration, because everything is in flux.
Some vendors, including huge ones, no longer test their software manually or with target test groups. Instead, they lean on early adopters and automated VM test runs to look for problems, and assume there are none if few complain.
How bad is it? Between bugs at the two largest OS vendors, most networked users could not reliably and securely print for months during the pandemic; luckily, most users were working offsite and did not notice these flaws. Workgroup network printing was “solved” by the PC LAN Program 40 years ago, yet for most of 2021 it was a mess, due to network security and compatibility issues that plagued these two huge companies.
Users of popular video conferencing apps, spreadsheets, and word processors now find a new version almost every time they start the application, thanks to continuous improvement. Things that failed early on now work, but known features also sometimes break, or move behind different menus and buttons than before. User documentation cannot keep pace with the unexpected changes, frustrating IT staff and many users alike. Again, stability has been lost.
The underpinnings of software, the operating system and many utilities and libraries, are also changing underfoot, at times leading to strange and unpredictable results.
Another consequence of cloud computing is that IT teams have lost still more control. At UW we would schedule most updates around term boundaries to give professors and students continuity for each full term, avoiding updates to systems like Quest at peak periods and updates to financial systems at fiscal year end (except the year the accounting system was shut down for two weeks for an upgrade). Now we are at the mercy of our cloud providers, who do not conform to our timetables, especially since they deal with many organizations, all on different schedules. Downtime still occurs, and remediation is no longer something we can usually control.
Users have largely benefitted from the webification of software, but there are some downsides to acknowledge.