
2024: The Year AI Leapt Into Your Laptop and Changed Everything

By Julian Wheatland

Adolescent AI

2024 was a landmark year for IT and communications and will go down in history alongside 1984: not because of any Orwellian implications, but because personal computing took a leap forward in 2024 comparable to the one it took 40 years earlier.

In January 1984 Apple launched the Macintosh, bringing the graphical user interface (the GUI) and the computer mouse to the mass market and triggering the personal computer revolution which, over the last 40 years, has transformed the way that we work and the way that we live. Technology discontinuities don't come along often, but in 2024, almost unnoticed, a step change in personal computing happened that rivals the launch of the Mac. It will radically alter our work and personal lives in ways that we can barely imagine, and it will have far-reaching effects for decades to come.

So, what was this tech development of cosmic importance? It was the news this year that the major laptop manufacturers are building dedicated AI chips into their devices.

I imagine many readers’ eyes glazed over when they read the last paragraph. What’s so great about another microchip? Why is he getting so excited about it? I could bore you by reciting all the technical differences between GPUs, FPGAs and ASICs, and how they differ from traditional CPUs, but that would be for a different article. What I will tell you, however, is why it matters.

Ever since late 2022, when ChatGPT came onto the world stage and the power of artificial intelligence (AI) and large language models (LLMs) became apparent, there has been an excited buzz (sometimes scary and sometimes optimistic) about the impact of AI and how it is going to change the world. However, there was a problem: running AI models requires a huge amount of computing power and, until now, that level of computing power has only been available in the cloud. This matters because, to access the power of applications that run AI models, it is necessary to send large amounts of data up into the cloud, have it processed on shared servers, and then send the results back to the local device. Not only is this slow, but it also presents major security risks through sharing confidential data with third-party service providers.

These latency and security vulnerabilities have constrained the development of AI and would have continued to do so if the built-in AI chip hadn't appeared on the scene this year. Until now, in spite of its promise, generative AI has been in its infancy, constrained to being little more than a toy: a party trick to chat with a robot or cheat on an exam. Serious applications that can analyse and process large quantities of customer, operational or personal data in real time have been laggy and treated warily by users. Attempts to embed these models into larger platforms have been challenging for developers because adoption is hampered by corporate security concerns and an inferior user experience.

In 2024 that changed. With AI chips built into personal computer motherboards, AI models can now be run locally on the user's device. This removes the security concerns associated with transmitting often sizeable quantities of confidential data outside the organisation or the personal domain, and it removes the compromising time lag that is detrimental not only to user experience but also to application efficacy.
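For the developers among our readers, here is a minimal sketch of what on-device inference can look like, assuming the open-source llama-cpp-python library and a hypothetical locally stored model file; it is illustrative only, not Progility code.

# Illustrative sketch only: running a quantised language model entirely on the local device.
# The model file name and the prompt are hypothetical placeholders.
from llama_cpp import Llama

# Load a model stored on the laptop itself; n_gpu_layers=-1 offloads as much
# work as possible to the machine's own accelerator. No data leaves the device.
llm = Llama(model_path="models/assistant-7b-q4.gguf", n_gpu_layers=-1)

# Confidential text is analysed locally, with no cloud round trip.
result = llm("Summarise the key risks in this customer contract: ...", max_tokens=256)
print(result["choices"][0]["text"])

The point is not the particular library but the architecture: the data, the model and the computation all stay on the user's machine.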

As of 2024, software developers across the globe are set free to incorporate artificial intelligence into new and existing applications that will truly tap into the promise of AI. This will put power in the hands of users that was inconceivable just a few years ago: ultimately, the power to draw on the knowledge of everything known by everyone, ever!

Don't just take my word for it. This is the message coming from the CEOs of Microsoft, Google, HP, Intel and Nvidia, to name just a few. And they're feeling the impact on their businesses: as demand moves from traditional CPUs to advanced AI chips, Intel is struggling to keep foundries open and Nvidia is struggling to meet demand.

At Progility we're already seeing the effect at the other end of the supply chain. Real-time AI functionality is fast appearing across all of our technology areas, from network management to cybersecurity. AI-enabled features available today include predictive fault anticipation, automated customer service, real-time video analytics, and network access anomaly detection.

We are still in the early days of AI, but this year it moved from infancy to adolescence. In the future AI will be embedded in almost every device you touch. And it was 2024 that made it possible.

 

(The author is Mr. Julian Wheatland, CEO, Progility Technologies, and the views expressed in this article are his own)

About Progility Technologies: https://www.progilitytech.com/

Progility Technologies Pvt Ltd is an established multi-domain solution provider and IT & digital systems integrator in India. Progility Technologies focuses on customised solutions to improve business and productivity across a wide variety of organisations.

Progility Technologies has a robust portfolio of solutions and excels in the development, implementation, and management of unified communications solutions, data network infrastructure, and video conferencing and security solutions.

Progility serves clientele across industries spanning banking, financial services, insurance, government, healthcare, IT/ITES, manufacturing and automotive. With a national footprint across India, Progility offers a diverse array of services and advanced solutions to cater to the ever-changing technology needs of modern businesses.