Thursday, April 30, 2026

Too much too soon? Lessons from the Wind Industry

Wind turbines became popular and evolved rapidly during the 1990s and early 2000s, fueled by climate change concerns that motivated the search for alternative, clean sources of energy. Governments recognized the potential and incentivized investment. The technology had existed before, but it had room for massive upgrades toward much higher efficiencies. Engineering and R&D in the area got a boost, and every couple of years companies launched new turbines with higher MW ratings.

For a wind turbine to generate maximum energy, it has to be placed on a site with good wind conditions, and in that respect, all places are not equal. In India, for example, there are places with strong wind profiles and others where winds are weak. Wind turbines also need to be in relatively uninhabited places because they generate noise at frequencies that are unhealthy for people nearby. They must be spaced adequately apart so that they don't interfere with each other. They are massive, and are installed in large numbers to form what are called 'wind farms'. They must stay put and remain operational for 20+ years for the investment to make sense.

Now you want to place your best machines in the best places. But the best machine is best only at a point in time, while the best place is always the best. And this is where the most logical approach is also a bit silly. The best places are taken up early, with the machines that were best at the time. In an emerging, technology-intensive industry that is getting a lot of attention, however, there is a constant race to develop better machines, and each upgrade makes the earlier one look like a toy. The irony is that the toy got the better playground. Since the best wind sites are taken, the new, better machines have to settle for poor sites.

The industry has tried to mitigate this in two ways: (1) move offshore in search of great wind conditions - but offshore has its own challenges; (2) develop turbines for poor wind conditions - which is like developing cars for poor roads because the highways are inaccessible.

As the industry matures, the turbines on great sites grow old and come up for replacement. At that point, you can put your new machines in the best places again.

It's a wave, and there's little you can do to beat it, especially in a fiercely competitive market.

Now draw the parallels and think about investments in AI-based solutions. In the scramble to get on board, are companies investing too much too soon? Are they creating legacy issues for themselves by rushing to adopt while the technology and its capabilities are still evolving rapidly? Is it FOMO, or smart investing for efficiency and value? Maybe the truth lies somewhere in between.

Most AI startup ideas I came across a couple of years ago, based on the shape of LLMs back then, are obsolete now. That is probably how any new technology wave emerges and then gradually stabilizes - short and quick cycles early on, long and slow ones as it wanes.

Originally posted on LinkedIn on 30 April 2026

Tuesday, April 28, 2026

Circle of Life

My father worked as a stenographer and typist in the Indian Railways, and he was exceptionally good at it. He doesn't type anymore, except on his mobile phone. Growing up, I saw firsthand how typing was supposed to be done - directly on paper, with no 'backspace' option, on a heavy contraption called the 'typewriter'. The act had the beauty of playing a musical instrument, all fingers playing their part, the artist performing with eyes closed and mind lost in rhythm and melody.

The other skill he had was writing in something called 'shorthand'. Bosses in those days gave 'dictations', which the stenographer noted down very quickly in this coded script and later converted into well-typed documents in English. While typing is something everyone does now, albeit with little regard for correctness of style or efficiency, shorthand is a skill that's non-existent today, if not lost forever.

In the late 90s, all organizations were on a rapid computerization spree. So a few years before his retirement, my father was forced to learn to work with a computer - the Windows 95/98 desktops. Since his job was drafting documents, he pretty much always worked in MS Word - software that was as reliable for word processing in its infancy as it is now. The typewriter was replaced by relatively sleek keyboards, still QWERTY, yet with soft keys. He beat those keys the way he was used to on his heavy typewriter. And it was a treat to watch him type - extremely fast, and without using 'backspace' once. He did that till his retirement in 2004.

But as a new breed of computer-literate officers started taking over, it was clear that a lot of his work would be done by the bosses themselves. Gradually, everyone knew how to type on a computer. It was not a skill any more; everyone just did it, and got used to it. Life has probably come full circle, as I watch my generation and older ones adopting AI-based tools, learning to write prompts, doing courses, and telling each other to learn or perish. 😊

Originally posted on LinkedIn on 28 April 2026.

Monday, April 27, 2026

A truly open mind stays open for life

One of the major challenges in developing a truly rational and coherent opinion about anything - one that a significant majority of people share - is that contemporary sources push different kinds or versions of information to each person. Information is carefully filtered, increasingly customized, and artfully moulded to be most engaging, pleasing, entertaining, even enchanting to the person consuming it. In a recent podcast with Trevor Noah, Ian Bremmer pointed to this as the major constraint on bringing about any ideological or political revolution that is truly constructive and has mass support, or that is intended to take down a corrupt system which uses its power to control information - both its nature and access to it.

While we generally tout diversity of opinions as a strength, opinions that are not rooted in critical and holistic analysis of an issue become biases. And when biases grow deep-rooted through constant reinforcement, you just can't get people to agree on anything, despite the passion, the conviction, and the apparently persuasive chains of logic with which they all argue.

In high-school math, before theorems - which are derived by proof - we are taught axioms - which are to be accepted as true. Axioms form the foundations, and theorems are built on top of them and on other theorems. The information structures around ordinary people are composed of highly distorted 'axioms', so the opinions built on them are biased, ill-informed and misleading - theorems based on false axioms give an unreal view of the world. And such views differ from person to person.

Critical thinking at a foundational level would help mitigate the formation of these baseline assumptions. A teacher must push minds to question basic assumptions before helping lay the blocks of a more mature ideology. A truly open mind stays open for life; a mind trained to close itself searches for cozy rooms to shut itself inside. It is therefore of utmost importance that students are taught - encouraged - to observe the world with an open mind, protected from biased information structures by inculcating critical thinking very early on.

Originally posted on LinkedIn on 27 April 2026.

Sunday, April 19, 2026

"Humans in the Loop"

This snapshot from "The Diary of a CEO" podcast is symbolic of the humongous human effort being invested in training this giant machine called AI to do everything that humans do. People who are excited by this see opportunities to shift work to AI and make money in the process. Most others feel threatened - if their work shifts to AI, how will they make money? And then there are the innocent workers in the picture, trying to earn their daily wage, with cameras and sensors strapped to their bodies to capture their movements and help robots learn how to get about... they are not thinking beyond that, coz they never did, never had the luxury, never dared to - it doesn't help anyway.

A version of "Humans in the Loop" indeed. Seen that movie?

Originally posted on LinkedIn on 18 April 2026.

Thursday, March 26, 2026

Importance of ethics in research

Research, as I understand it, is the endeavour to uncover the nature of reality. Irrespective of what I believe, what I think, and what seems to be, research is finding a path to establish the true shape of reality - whether of the physical world, sociological phenomena or the metaphysical. It has to be an honest endeavour grounded in accurate data, robust analysis and transparent presentation of results. It needs the humility to acknowledge its limitations - known and unknown. Our understandings and explanations of the nature of reality - our theories - help us make sense of the world, yet can never claim to have fully solved the puzzle. To get ever closer to an understanding, we build on prior research and add incremental blocks to construct better explanations. It is therefore the responsibility of every researcher to be uncompromising about data, approach, results and interpretation - so that future research finds a stronger base to build on, and past studies get the honour and respect they deserve.

Originally posted on LinkedIn on 24 March 2026

How to forge better leaders?

Managers are expected to be almost perfect - clear communicators, good with people, sharp with data, and composed in how they present themselves. There is a visible checklist, and they are constantly measured against it. If they fall short, it shows quickly.

Leaders, on the other hand, seem to operate under a different lens. They can be unconventional, intense, sometimes even a bit irrational, and yet, this is often read as vision or conviction rather than a flaw. Part of this difference comes from where expectations sit. Managers are expected by others to meet a standard. Leaders, in many ways, shape the standard themselves, and the rest of us adjust to it.

This difference becomes sharper when you step outside firms and think about countries. In a firm, you can choose, or at least attempt to choose, the leader you want to work with by moving to another firm - though even there you rarely choose the leader directly. In a country, you live with the leader, whatever process brought them there. In a democracy, the leader sometimes reflects your choice, but often doesn't. You don't really opt out - you adapt, engage, or endure.

Which makes you wonder what exactly we are preparing people for. Much of management education seems designed to produce well-rounded, reliable managers - people who meet defined expectations. But leadership doesn’t quite emerge from checklists. It comes with ambiguity, intensity, and a willingness to push beyond what is already defined.

Can management education add a stronger ingredient of true leadership to its recipe, so that the ability to handle ambiguity and difficult situations is matched by equally strong wisdom, analytical depth, and grounded judgment? So that we see fewer leaders driven by impulse, and more who combine conviction with clarity? Work experience seems to be a bad teacher when it comes to leadership.

Originally posted on LinkedIn on 26 March 2026

Monday, March 16, 2026

AIs and opinions

We expect opinions based on data and facts to be reliable, though not indisputable. Since they are still "opinions", they are bound to contain the giver's bias, springing from intuition, experience, judgement or quirks. It's human, and we understand the mechanism.

When it comes to AI, though, it gets troubling when we get opposite opinions from different tools. Because they are "tools" driven by "computing", we are inclined to trust what they say, not as "opinions" but as some version or degree of "truth". But what if, after consuming all the data, ChatGPT suggests, even encourages, that you do something, while Claude tells you it's stupid, even suicidal, and you should never do it? The data is the same. Are the tools acquiring "personality"? They are expected to, given the effort to mimic human faculties. But with humans, we have ways of figuring people out. With AI, we have a totally different kind of quagmire to deal with - one that is neither unique, nor consistent, nor revealing in the way humans are. And in the background, it is being built, taught, designed, tweaked and tinkered with - all by humans. And worst of all, it is not allowed to say "NO".

Originally posted on LinkedIn on 16 Mar 2026
