Day: May 28, 2025

  • Inequality in India Is Not Just in Income: There Is a Bigger Gap to Be Bridged

    The “Skilling Smarter” report reveals a stark truth: while nearly every organization claims to value learning, only 1 in 4 workplaces has a formal skilling strategy. Half of all professionals received zero training in FY24–25. What we’re seeing isn’t just a learning gap—it’s a strategic failure that could trigger long-term risks few are prepared for.

    Read the Report by UpGrad

    The Cost of Misaligned Skilling

    Today’s workforce spans three generations, each with its own learning style. Yet 63% of HR leaders admit they don’t tailor training by cohort. That’s like using the same playbook for tennis, cricket, and chess—bound to fail in at least two arenas.

    The consequences are already visible:

    • 75% of skilling participation is driven by mandates, not motivation.
    • 51% of employees say the content isn’t relevant.
    • 61% of CHROs report no measurable ROI.

    But the deeper issue lies in a dangerous disconnect: while employers focus on technical and compliance training, employees are hungry for soft skills, strategic thinking, and personal growth. This mismatch signals more than poor engagement—it hints at future waves of quiet quitting, stagnation, and attrition.

    Second-Order Effects: What Happens If We Don’t Act

    1. Rise of Learning Inequality: Just as income inequality widens societal gaps, “learning inequality” will widen skill gaps inside companies. Those who get training will keep growing; the rest risk becoming obsolete.
    2. Innovation Bottlenecks: With no training in creativity or strategy, businesses risk raising a generation of technically sound but vision-starved talent—precisely the opposite of what AI-era leadership demands.
    3. Invisible Attrition: Employees may stay on the payroll but mentally check out. When skilling feels irrelevant, optional, or inaccessible, disengagement becomes the default.

    It’s time to move from mass-produced learning to mass-personalized growth.

    Three Creative Ideas to Rethink Skilling

    1. The “Skills Studio” Model

    Borrowing from design thinking, set up cross-functional “Skills Studios” where employees solve real-world business challenges using new skills. Learning is not separate from work—it is the work. Studios can rotate monthly across functions (e.g., Sales Studio, AI Studio, ESG Studio), creating a culture of experimentation and applied learning.

    2. Skilling as Currency

    Create an internal “SkillCoin” system: employees earn credits for learning, mentoring, or teaching others. Credits can be redeemed for project opportunities, shadowing stints with leaders, or even additional leave days. This gamifies learning and embeds it in the employee value proposition.
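
    To make the mechanics concrete, here is a minimal sketch in Python of how such a SkillCoin ledger could work. The activity names, credit values, and reward catalogue are illustrative assumptions, not figures from the report.

```python
from dataclasses import dataclass, field

# Hypothetical credit values per activity (illustrative only)
EARN_RULES = {"complete_course": 10, "mentor_session": 15, "teach_workshop": 25}

# Hypothetical redemption catalogue (illustrative only)
REWARDS = {"stretch_project": 40, "leader_shadowing": 60, "extra_leave_day": 100}

@dataclass
class SkillCoinAccount:
    employee_id: str
    balance: int = 0
    history: list = field(default_factory=list)

    def earn(self, activity: str) -> int:
        """Credit the account for a recognised learning activity."""
        credits = EARN_RULES[activity]
        self.balance += credits
        self.history.append(("earn", activity, credits))
        return self.balance

    def redeem(self, reward: str) -> bool:
        """Spend credits on a reward if the balance covers it."""
        cost = REWARDS[reward]
        if self.balance < cost:
            return False
        self.balance -= cost
        self.history.append(("redeem", reward, -cost))
        return True

# Example: two mentoring sessions and one workshop, then a stretch-project redemption
acct = SkillCoinAccount("E1024")
acct.earn("mentor_session")
acct.earn("mentor_session")
acct.earn("teach_workshop")
print(acct.redeem("stretch_project"), acct.balance)  # True 15
```

    The point of the sketch is simply that earning and spending are both logged, so learning activity becomes a visible, exchangeable asset rather than a compliance checkbox.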

    3. Gen-Specific Learning Playbooks

    Ditch one-size-fits-all LMS platforms. Instead, build modular learning playbooks tailored to Gen X, Y, and Z. Gen X might prefer on-demand case studies and expert talks; Gen Y thrives on structured micro-courses and peer-led sessions; Gen Z wants immersive, mobile-first experiences. Let each generation own their journey in a way that suits them best.
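
    One lightweight way to express such playbooks is as a small configuration that maps each cohort to its preferred formats. The Python sketch below uses assumed cohort labels, format names, and cadences purely for illustration; they are not prescriptions from the report.

```python
# Illustrative cohort-to-format mapping (assumed labels, not from the report)
LEARNING_PLAYBOOKS = {
    "gen_x": {"formats": ["on_demand_case_studies", "expert_talks"], "cadence": "self_paced"},
    "gen_y": {"formats": ["structured_micro_courses", "peer_led_sessions"], "cadence": "weekly"},
    "gen_z": {"formats": ["immersive_mobile_first_modules"], "cadence": "daily_bites"},
}

def recommend_formats(cohort: str) -> list[str]:
    """Return the preferred learning formats for a cohort, if a playbook exists."""
    playbook = LEARNING_PLAYBOOKS.get(cohort)
    return playbook["formats"] if playbook else []

print(recommend_formats("gen_z"))  # ['immersive_mobile_first_modules']
```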


    In the age of Gen AI, skilling isn’t just about surviving disruption—it’s about owning it. The most valuable asset in any organization isn’t its technology. It’s the speed at which its people can reinvent themselves. Let’s start skilling smarter—not just more.

    #career #inequality #skillsgap #upgrad #2025

  • Sovereign AI: Why Control Over Culture Matters to Every Nation

    The Danger of a Single Story

    Imagine if your entire country had to rely on another country’s oil to keep the lights on. Or imagine if every movie, song, or textbook you had access to came only from one or two countries—and you had no control over what stories were told, what values were shared, or how your people were portrayed.

    That’s kind of what’s happening right now with Artificial Intelligence (AI). Some countries are no longer okay with just using AI models built in the US or China. They want their own. Not just to “use” AI—but to own the tools that shape how AI thinks, what it says, and who it serves.

    Let’s take France as an example, where the choice to build sovereign AI is deeply rooted in its specific cultural context, not just in economic or security concerns.

    Why must English be the dominant language for AI?

    There are words in many languages that have no equivalent in any other language. “Uncle” and “aunt” fail to capture what bua, chacha, fufa, or maasi mean; ask anyone who speaks Hindi. There are words in Bengali that simply cannot be translated, e.g. nyaeka or gaaye pawra.

    Imagine a French student asking an American-trained AI, “What makes French culture unique?” The answer might include food, art, and history—but subtle, culturally loaded ideas (like the role of laïcité or the importance of the French Republic’s civic values) may be underrepresented or misinterpreted, simply because the training data doesn’t reflect them deeply or accurately.

    Read about words that have no equivalent in English

    Cultural Tensions That Influence This:

    • Language Preservation: France has strict laws (like the Toubon Law) mandating the use of French in public life, including advertising and education. An AI model trained mostly on English content undermines this priority.
    • View of Secularism (Laïcité): The French conception of secularism is unique and central to its national identity. An AI trained in the U.S. may reflect a more pluralistic or religiously expressive worldview, clashing with French norms.
    • Data Sovereignty: France (and the EU more broadly) sees data as part of national identity and autonomy. Using foreign AI models is viewed not just as a technical dependency but as a cultural compromise.

    What They’re Doing:

    France, along with other European nations, is investing in building its own European LLMs (such as those from Mistral, a French AI startup) that are:

    • Trained on multilingual, culturally diverse data sets.
    • Designed to follow EU laws and ethical standards.
    • Intended to preserve cultural nuances and national values.

    In this case, France’s move isn’t just about economic competitiveness or digital sovereignty. It’s a cultural decision: to make sure French identity is not diluted or misrepresented by AI systems built elsewhere.

    Check out this talk by Chimamanda Ngozi Adichie, “The Danger of a Single Story.”


    From Oil Fields to AI Factories

    For much of the industrial age, oil was power. If you had oil, you could run factories, cars, and armies. Today, data centers (the massive buildings full of computers that run AI) are becoming the “oil fields” of this generation. But instead of drilling for oil, we’re training large language models (LLMs) like ChatGPT.

    That’s why places like Saudi Arabia are no longer calling them “cloud providers” (places you just rent storage from). They’re calling them AI factories—places that create intelligence. And just like countries once competed to control oil, now they’re racing to build and control their own AI.


    Why It’s Not Just About Technology

    Here’s where it gets interesting.

    AI isn’t just about tech or gadgets. It’s about culture, values, and control over what people see and believe.

    Think of an AI model like a super-intelligent librarian. It can answer questions, explain ideas, and even give advice. But what if all the books in that librarian’s head came from just one country, with one way of thinking? That means the answers it gives are shaped by a single worldview.

    Now imagine that librarian starts working in your school, in your government, or on your news platform. You’re not just outsourcing knowledge—you’re outsourcing your voice.

    That’s why these models are called cultural infrastructure. They don’t just compute—they shape the stories we tell ourselves as humans.


    The Vulnerability of Outsourcing Intelligence

    Here’s a metaphor: Say your country is playing in the Olympics, but you don’t train your own athletes—you just borrow athletes from another country. What happens if that country becomes your rival? Or stops letting you use their talent?

    That’s the risk with using AI from other countries—especially ones that might be politically or ideologically different.

    It’s even harder when you don’t know exactly how their AI was trained. What data went into it? What biases are baked in? If you don’t control the training, you can’t be sure what values the AI will promote—or hide.

    This is why some experts call this the beginning of LLM diplomacy—a new kind of global relationship built not on weapons or trade, but on who builds and owns the “thinking machines.”


    So What Should Countries Do?

    Countries now face a choice:

    1. Build their own AI models – like Saudi Arabia’s “Humain” initiative. This gives them full control but requires massive investment in talent, data, and hardware.
    2. Partner with others – teaming up with friendly nations or companies to co-develop AI. This is cheaper but still means some compromise on control.
    3. Do nothing and use existing models – easy and fast, but potentially risky if you lose cultural or strategic autonomy.

    The Bigger Picture: It’s About Who Gets to Shape the Future

    This shift might sound distant to some. But imagine if your college textbooks, search results, and job advice all came from an AI trained in another country—one that didn’t understand your history, your humor, or your values.

    AI is becoming the lens through which we see the world. That’s why countries don’t want to just rent AI—they want to build it, shape it, and own it.

    Because in the future, owning the “mind” of the machine may matter as much as owning land, oil, or weapons did in the past.

    That’s the world we’re stepping into: one where sovereign AI (homegrown intelligence) is the next big power move. One size does not fit all. We don’t want a world with only a single story to listen to.

    Listen to this podcast