Why the Digital Divide Keeps Getting Wider

The digital divide is an economic, social, and technological gap between different demographics and sectors of society. At its core, it boils down to a lack of access to modern computing in some form or another. Many factors contribute to it: poverty, education, and poor infrastructure are just a few.

The digital divide is also a source of inequalities that may take the form of discrimination on the basis of race, gender, or socioeconomic status. While it is often thought of as something that only plagues “poorer” nations, it’s also evident in developed nations with high levels of inequality. Silicon Valley “tech bros” can’t understand why anyone would want an offline mode, while other areas feel like technological time capsules from years past.

The digital divide isn’t a phenomenon that disappeared with the advent of cheap broadband and cellular access in more and more places. Technology continues to grow and develop, and that progress keeps moving the goalposts. There is still very much a digital divide, and the gap continues to widen for both individuals and businesses. The only difference is what falls into it.

The Traditional Digital Divide

Phones were the original technological schism that defined what came to be known as the digital divide. At one point, phone access was a luxury in the US (and many other nations). Some rural communities resorted to barbed-wire telephone lines and similar improvisations, but there’s a difference between what keeps up with the local economy, what is competitive regionally or even nationally, and what is competitive globally.

The last-mile problem stayed relevant into the early 2000s (in the US). While electricity and running water are taken for granted in most areas of the US, some regions got them much later than others, and telephones, internet, etc. have taken even longer to reach various places. Technology is iterative: to get fast internet, you need power, you need coax (or better), and you need the equipment to run it.

In the 90s and early 2000s, the digital divide meant the ability to access the internet at all, which grew into the ability to access broadband. There were large demographic gaps between who could get online and who couldn’t, divided by race but also by region within the United States. Rural areas still have difficulty getting fast internet access, or even what passes as “acceptable” internet. There are regions in the US today where the only option is satellite (if that).

Goalposts move as the web adapts to the average expectation, and the average keeps moving because the extremes have inflated. Fewer people have absolutely no internet access, but the original digital divide is very much still alive.

Where We Are Now

The modern digital divide is hard to quantify. Internet access is arguably near-ubiquitous, but how do we measure usable internet? It gets a lot trickier to pin down what you need to survive on the modern internet.

You have to consider the implications of the Internet of Things, edge computing, ambient computing, etc. Every layer of complexity on the internet drives the wedge deeper, whether it’s devices or the complexity of JavaScript and other programming languages. A computer from the past few years isn’t the same as a computer from this quarter.

Buying a computer has become far more caveat emptor with the chip shortage and the varying levels of hardware availability across the divide. One of my laptops is fine with 8GB of RAM, but plug in an external monitor and you can push its limits in no time. I’m certainly not running anything like GPT-J on it.
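To put that in perspective, here’s a rough back-of-envelope sketch (in Python) of why a GPT-J-class model is out of reach for a budget laptop: GPT-J has roughly 6 billion parameters, and just holding the weights in memory outstrips 8GB of RAM. The figures are approximations for illustration, not exact requirements.

```python
# Rough estimate of the memory needed just to hold a model's weights.
# Approximate figures; ignores activations, OS overhead, etc.

GPTJ_PARAMS = 6_000_000_000  # GPT-J has ~6 billion parameters

def weights_gb(params: int, bytes_per_param: int) -> float:
    """Memory (in GB) to store `params` weights at a given precision."""
    return params * bytes_per_param / 1024**3

print(f"fp32: ~{weights_gb(GPTJ_PARAMS, 4):.0f} GB")  # ~22 GB
print(f"fp16: ~{weights_gb(GPTJ_PARAMS, 2):.0f} GB")  # ~11 GB
# Either way, far more than the ~8GB of RAM in a typical budget laptop,
# before counting the OS, the browser, and everything else running on it.
```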

Computing isn’t as “one-dimensional” as it used to be; CPU, RAM, disk space, etc. all factor into modern computing tasks. Storage ranges from 16GB eMMC drives to laptops with both an SSD and an HDD. RAM runs from 2GB to 64GB. Processors range from dual-core chips without multithreading to octa-core parts with SMT (simultaneous multithreading).

The digital divide isn’t just about having access to the internet (though that’s still a real problem); it’s about having access to the modern internet without gaps. It’s one thing to be able to Google basic information, another to stream video, and yet another to create it. Sadly, some people are still unable to do any of these, even in the United States.

The Digital Divide for Resources

The digital divide is more about resources now than it used to be. You can’t run modern machine-learning-based technologies like GPT without the right resources. It’s one thing to view a site; it’s another to run a modern machine learning system.

Most laptops still top out around 8GB of RAM for cheaper offerings, with 12GB or 16GB at the higher end. It’s hard to get a graphics card even now. Things should change as the chip shortage ends, but even then, the technology has stalled. The haves and the have-nots have grown further and further apart.

A lot of technology is available with just a cell phone, but the cutting edge is entirely cut off. A Chromebook or a phone gets you the internet, but not necessarily modern technology without extra steps and a subscription.

It takes time and money to have the ability to compute online. I use multiple servers to stay relevant at my job, but many people don’t have that option. Running a system around the clock requires more than just the power or knowledge to get it going; you need a purpose. Without the ability to “dip your toes into the water,” it can seem impossible to find a reason to keep going. It’s easy to drop out of the race when you don’t know where you’re supposed to go, or why.

Which resources are available or practical shapes how people use technology. You can log into every website, or you can make websites. One takes a browser; the other takes the infrastructure and know-how to support it. Not everyone gets the same opportunities here either.

The Digital Divide for Businesses

The digital divide affects individuals and families, but it also affects businesses. A business in a remote setting will (probably) have slower internet and lesser hardware than a company in a tech center or a tech-centric industry. You spend most of your waking hours at work, so this has a massive impact on personal development (as well as on a business’s profitability).

For most white-collar jobs, a slow computer means less work gets done in a given amount of time. Excel being slow is fine for a receptionist, but for a financial analyst it can make you less competitive as an employee and the company less competitive as a business. Developers have even more specific requirements.

Graphics cards are still in short supply, and RAM has a shortage as well. Processors are finally competitive thanks to the battle between AMD and Intel, but computing had basically been treading water for almost a decade: AMD wasn’t competitive (until recently), so Intel made very few real innovations.

I was using a literally 10-year-old machine before it failed, because virtually nothing beat its raw benchmarks at a sane price point. You got a minor improvement with each generation of Xeon or Core i-series processor, but nothing felt like the old days of jumping from a Pentium 4 to a Core Duo or Core 2 Duo. A good processor from a few years ago is better than a middle-of-the-road processor today. RAM grew, then stagnated; the big migration from 4GB to 8GB happened when systems moved from 32-bit Windows to 64-bit, since 32-bit Windows could only address around 4GB of RAM. A lot of people are still buying machines with only 8GB of RAM.

Going Beyond Computing

Some businesses have kept up, but most smaller ones just buy a machine in the $300-600 range (unless they need specific resources). Technology companies may spring for more, but usually only in specific situations. While this may seem a bit low, it’s what I’ve seen with a large number of small and medium businesses spanning almost every industry imaginable.

Computing power isn’t the only thing businesses skimp on in increasingly risk-averse markets; they cheap out on many digital and technological solutions. Firewalls, cloud products, network architecture, servers, and the like are all things businesses tend to underspend on more than they should from a security standpoint.

That said, some companies focus on these tools for success and profit heavily from them. An agile (not necessarily the framework) business with more computing resources at its disposal is going to be more successful. The digital divide that affects individuals can spill over into businesses through education and geographic location, but income remains a factor as well.

The distance between the front of the pack and the average has increased a lot. Technology is more ubiquitous and more is being spent on it, but the ante to play in the modern world is higher. You can’t just focus on a few aspects either: infrastructure is more complex, and software, security, and hardware costs come with new requirements each year. You used to only have to worry about local compliance, but now you need to be aware of GDPR, CCPA, etc., because a single wrong visit can potentially trigger a lawsuit or fines.

On the Consumer Side

The consumer world doesn’t feel much different. There are more and more costs to staying ahead of technology at home, and only some of that investment actually carries over to the work world.

Home automation is big, SaaS and similar solutions are more and more common, and subscription services continue to grow. Hardware is harder to source with the ongoing chip and graphics card shortages. Further out, things like 3D printing, advanced 2D printing, and other creative endeavors have a technical buy-in. There is also a learning curve for all of these changes, and it hits some people harder than others.

You can’t just spend money to conquer the digital divide; there is a time and education requirement as well. Technology has gotten easier to consume but much harder to use to its full potential. The barrier to entry has never been lower, but the barrier to understanding keeps getting higher. Computing has grown in scope and breadth while shrinking in access requirements at the lower end.

Anyone can use an app, but there is far less understanding of how the increasingly complex machine fits together. This isn’t a complaint so much as an acceptance of the inevitable march and diversification of technology. At one point a single person could practice all of medicine; now specialists may focus on a subtype of a single condition. There’s a similar amount of information to master (depending on how far back we go), but what that information gets you has changed.

Why It Matters

The digital divide as originally envisioned has changed substantially over the past decades. While the older factors remain in play at lower levels, there are countless new ones. It’s not enough to just have a computer and an internet connection; you need the resources to both consume and create in order to stay competitive. The change itself isn’t necessarily bad, but the result it has produced is.

The barrier to entry was higher in the 90s and early 2000s, but it took less to be able to both consume and create digitally. There were fewer options, and each was more expensive: a computer and a connection were pricey buy-ins, but they were all you really needed. Now there are so many facets that even though each is cheaper and easier than before, they add up to something more complex and more expensive by sheer volume.

What it takes to handle basic computing, and the knowledge needed to stay ahead, have continued to grow rapidly. The technology itself is cheaper, but the requirements to make it all work have gotten more and more complex. It’s one thing to use a machine learning model; it’s another to train it. Companies don’t want someone who can’t “hit the ground running” anymore either.
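As a rough illustration of that use-versus-train gap, here’s a hedged sketch of a common memory rule of thumb: inference mostly needs the weights themselves, while training with an optimizer like Adam also needs gradients and optimizer state, several times more memory per parameter. The multipliers below are conventional approximations, not exact figures.

```python
# Back-of-envelope: memory to *use* a model vs. memory to *train* it.
# Rules of thumb only; real numbers vary with precision, batch size,
# activation checkpointing, and the optimizer used.

def inference_gb(params: int, bytes_per_param: int = 2) -> float:
    """Inference mostly needs the weights themselves (fp16 here)."""
    return params * bytes_per_param / 1024**3

def training_gb(params: int) -> float:
    """Training with Adam roughly needs fp32 weights, gradients, and
    two optimizer moments: ~16 bytes per parameter, before counting
    activations."""
    return params * 16 / 1024**3

params = 6_000_000_000  # a GPT-J-sized model
print(f"inference: ~{inference_gb(params):.0f} GB")  # ~11 GB
print(f"training:  ~{training_gb(params):.0f} GB")   # ~89 GB
```

Training also assumes hardware (usually GPUs) that can actually chew through the data in a reasonable time, which is exactly the kind of resource the divide cuts off.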

To bridge the digital divide, there are more moving pieces than before. We aren’t just facing a lack of internet access but a lack of meaningful technological knowledge. Knowing Excel or the like used to get you a job; now it just sets you above another candidate, maybe. The digital divide has grown, and the only way to bridge it is enough specialization to not be locked down (as ironic as the concept is). It’s a social problem and an economic one, and it requires deeper insight and understanding to truly resolve.

Image by Peter Olexa from Pixabay