Designing for the Long Term: Head-to-Head

As organisations race to adopt generative AI, the real differentiator is no longer access to tools, but the skills, judgement and ethics required to use them well. Josh Meier, Senior Generative AI Author at Pluralsight, brings a practitioner’s perspective on what it takes to build AI-capable teams, rethink expertise, and prepare both leaders and learners for an augmented future of work.

By 2026, what will distinguish teams that work well with AI from those that merely use it?

“In 2026, organisations that see returns will be those investing not just in AI tools, but in the complementary skills needed to use them effectively.

“Already, 95% of UK employers say tech upskilling is a priority, but 50% of employees lack the time and 93% lack support from their managers to actually upskill. As a result, nearly half (48%) of AI initiatives have already been abandoned due to skills gaps.

“As AI investments come under increasing scrutiny for the value they deliver, as reflected in the recent AI bubble conversation, 2026 will be the year when true AI skills determine ROI.”

“The teams that will be most successful with AI are the ones that integrate it into their workflows, identifying where AI can best be used to streamline choke points. Knowing when AI is the best tool for the job, combined with domain-specific knowledge of workflows, is what will dictate success with AI tools.”

How does augmented work change the definition of competence and expertise?

“In the past, seniority equated to greater knowledge and expertise. But now, in the AI era, there is a clear generational divide in AI knowledge that does not correlate with job title or tenure.

“Younger professionals are more familiar with AI: millennials are 1.4 times more likely than other age groups to be extensively familiar with GenAI tools and 1.2 times more likely to expect workflow changes within a year. In stark contrast, 91% of C-suite executives, the most senior people in any organisation, admit to exaggerating their AI knowledge.

“AI is a relatively new tool, and many members of today’s workforce have not had the opportunity to engage with it in an academic or structured way, unlike younger generations. What competence and expertise mean hasn’t changed; they simply now require an understanding of AI tools and how to integrate them into the existing processes of each domain. Because these technologies have emerged rapidly over the past few years, professionals who are unwilling to explore or engage with them risk seeing their expertise lose relevance.”

“We need to ensure businesses upskill employees across all levels so there is not a saturation of AI knowledge among younger workers and a depletion at the top levels. Levelling the playing field in this way will ensure everyone can contribute towards AI innovation goals.”

How should education systems prepare people for work that doesn’t yet have clear boundaries or titles?

“Although we can’t be sure what roles will exist in the future, we can be confident that today’s students will enter a working world that requires them to use AI – from admin tasks to idea generation to deep research. 

“If we don’t start embedding tools like ChatGPT into students’ academic lives now, teaching them to use AI in a way that is transferable to work, Gen Alpha will arrive at the workplace unprepared. Incorporating AI into learning teaches students foundational ways of working that will serve them in whatever industry they choose. This includes how to prompt LLMs, how to scrutinise AI outputs, and how to use AI as a work assistant rather than a replacement for thinking.

“Foundational knowledge will always be applicable, so integrating AI into higher education in the right ways can give students a better understanding of how to work with AI models, their capabilities and their limitations. LLMs themselves may change, but the skill of logically parsing a response or output will always be important, just as it was before LLMs.”

What risks do organisations face if they train people to use AI without teaching them how to challenge or contextualise it?

“Access to instant information from AI tools does not eliminate the need for critical thinking; it augments it. With information right at our fingertips, users must develop the skills to question, verify, and contextualise the answers they receive from AI.

“Teaching digital literacy today means teaching how to think critically about AI: understanding bias in training data, recognising misinformation, using sound judgment when interpreting results, and always verifying outputs. Users need to understand that they can’t simply trust everything AI gives them. Blind trust in AI risks biased information, factual inaccuracies and poor decision-making.

“That’s why AI skills and digital ethics must be taught together – from responsible prompting to data privacy. The goal isn’t just to teach people how to use AI, but how to use it wisely.” 

How can learning programmes remain relevant when the tools they teach may change within months?

“The government’s investments, from TechYouth to adult reskilling programmes, reflect a growing recognition that AI skills must be developed at all stages of life, forming the foundation of an AI learning arc.  

“But realising this vision requires a shift in how we think about learning programmes entirely. Learning can no longer be seen as something frontloaded in youth; it must become a continuous, lifelong commitment. In a world where digital skills can become obsolete in just a few years, ongoing education is essential. In fact, the World Economic Forum predicts that by 2030 more than half of today’s workforce will need reskilling due to the rapid pace of technological change.

“Building the AI learning arc means weaving AI literacy throughout the entire educational and workplace journey. This could start with simple tutorials about AI and introductions to tools in primary school, expanding into project-based learning in secondary school, moving into introductory development and machine learning courses in higher education, and then ongoing skills-based learning in the professional world. 

“We can already see international models developing this approach in higher education. In the U.S., Ohio State University, for example, has launched an ‘AI fluency initiative’: a commitment to integrate AI tools into all undergraduate courses, ensuring that students know how to apply them in their respective fields. This provides students with a foundational understanding of digital tools that can be built upon as technologies continue to change.”

“Keeping the focus on foundational skills such as critical thinking, and teaching proper mental checks, will help ensure that users of AI tools understand what to ask and when to ask it.”

What mindset shifts must leaders make to support long-term learning in augmented environments?

“To lead with confidence in the AI era, leaders must build foundational AI knowledge for themselves. This is vital if they’re going to make strategically sound investments around AI and lead AI-native teams. This doesn’t mean learning to code; it means understanding what AI tools can do, how to apply them strategically and where the risks lie.  

“Leaders need to change their mindset so that upskilling is not treated as a one-off initiative. It must be continuous, embedded into workflows, and adapted to how people learn today: through short-form content, on-demand platforms and real-world application.”

“Workers who don’t have the skills to navigate an AI-enabled world become less valuable every year. On the other hand, the money it takes to train employees for the new world of work will always pay dividends for the company. Hiring new people who have the skills you need but no understanding of your systems means it can take them years to match what someone already on the team could achieve with a few weeks or months of training. In almost all cases, the ROI of training in-house employees for your needs is far higher than that of finding new ones.”

Josh Meier, Senior Generative AI Author, Pluralsight.

Josh Meier is the Senior Generative AI Author at Pluralsight, developing courses on generative AI with a particular focus on the ethical use and development of AI systems. He previously worked in software engineering and holds a Master’s degree in Artificial Intelligence. Josh is passionate about AI and the ethics surrounding its use and creation, and has honed his skills in generative AI models, ethics and applications.


