AI: a paradigm-shift for software?

In his landmark book “The Structure of Scientific Revolutions”, Thomas S. Kuhn introduced the concept of the paradigm-shift: a fundamental change in the way we view and approach a particular problem or field.

For there to be a shift, the new paradigm:

  • explains previously inexplicable phenomena
  • resolves inconsistencies in the old paradigm
  • fundamentally changes the way we approach the field

Today, we are witnessing another shift in the computing paradigm: AI is making it easier and faster to create code, automate tasks, and more, while improving itself at an accelerating pace. Like other paradigm-shifts in computing, it is enabled by the predictable increase in computing power (Moore’s Law). In addition, AI-oriented computing capabilities exhibit a ‘stacking’ or multiplier effect that may prove to scale in a similarly compounding way.

Will AI-oriented computing be a true paradigm-shift, or simply a continuation of the current paradigm?

There is a striking, fundamental difference between the current paradigm, in which humans write code to explicitly define the steps a computer should take, and a potentially new paradigm of algorithms and models that learn and make decisions based on data and objectives.
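To make that contrast concrete, here is a minimal, purely illustrative sketch in Python. The toy task, the data, and the use of scikit-learn’s DecisionTreeClassifier are my own assumptions, not a prescription: the first function encodes the decision steps explicitly, while the second learns a comparable decision from labeled examples and an objective.

```python
# Illustrative sketch only: contrasting explicit code with a learned model
# on a toy task (deciding whether an order qualifies for free shipping).
# The task, data, and library choice are assumptions for illustration.

from sklearn.tree import DecisionTreeClassifier

# Current paradigm: a human explicitly defines the decision steps.
def free_shipping_explicit(order_total: float, is_member: bool) -> bool:
    if is_member:
        return order_total >= 25.0
    return order_total >= 50.0

# Candidate new paradigm: a model infers the decision from data and an
# objective (here, matching historical outcomes), not hand-coded steps.
X = [[20.0, 1], [30.0, 1], [40.0, 0], [60.0, 0]]       # [order_total, is_member]
y = [free_shipping_explicit(t, bool(m)) for t, m in X]  # historical labels

model = DecisionTreeClassifier().fit(X, y)
print(model.predict([[30.0, 1], [45.0, 0]]))  # decisions learned, not coded
```

The point is not the particular library; it is the inversion of responsibility, where the rules move out of the code and into the data and objectives.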

Measured against Kuhn’s criteria, the AI-oriented computing paradigm can explain things that were previously inexplicable, and it can resolve many of the inconsistencies and inefficiencies of the current paradigm.

Perhaps most importantly, AI has the potential to enable new applications and solve problems that were previously impossible or impractical to address using traditional methods.

More broadly, a true Kuhnian paradigm-shift is NOT just a ‘new technology or breakthrough’; it must include:

  • a change in the way we think about and approach computing as a whole
  • a shift in the underlying assumptions and concepts that guide our understanding

My Take

On its current trajectory, AI-oriented computing is approaching a new paradigm.

AI-oriented computing could fundamentally change the way we think about and approach what we currently call the ‘software design / development / maintenance’ process, moving us toward new thinking and approaches that are not centered on ‘the code’ and the multitude of tasks and human effort that surround it today.

I anticipate the new paradigm will be increasingly interdisciplinary: the definition, purpose, and process of what we call ‘software’ today will become more inclusive across disciplines and domains, with an increasingly broad span of capabilities, from the most ephemeral or esoteric personal uses (‘I need a one-time app composed for what I am doing today’) to the most sensitive and mission-critical business functions (‘execute according to required policies and business objectives’).


What do you think?

Will AI-oriented computing be a new paradigm, or is it simply a continuation of the current paradigm?


Web 2.0 and the Emperor’s New Clothes?

This is a blog post written by Robb Bush on June 3, 2010 that has long since disappeared from the internet.

Can someone explain why/how the various “Web 2.0” thingamajigs actually HELP business productivity?

I think they are neat-o, of course. I like things like Facebook and LinkedIn. Sometimes. But I can’t say they make me more productive or effective in my real everyday work, at least not compared to what I expected business productivity software and networked applications to do.

Is it the next Photoshop? The next Excel? The next advanced Planning and Optimization tool? I say emphatically no.

Almost all of these wonderful “Web 2.0 innovations” generate an explosion of content fragments that may – or may not – contribute to “useful knowledge”. Much of it is “in the now” – and eventually disappears or becomes buried.

Here’s what I see:

  • Increased fragmentation of knowledge
  • Further challenge to already incredibly short attention spans (you want executives to use this?)
  • Reduction of meaningful conversation/ideas into soundbites (particularly challenging when multilingual)
  • Over-reliance on text messaging / human interpretation without process support (e.g., “Take a look at this…”, “What do you think of…”, “Here’s an action item…”)

Weren’t computers and software supposed to make us smarter and bring us information, not just inspire us to poke around and click at disconnected messages?

The over-reliance on linear text messaging, rating things up/down, “likes”, the generation of excessive random tasks and discussions, and the capture and “trapping” of digital assets (more PPTs!) all seem to be further “dumbing things down” for everyone.

However, it all provides the illusion that “things are happening”. That must be good, right?

I consider this phenomenon in our industry another example of “the generation of artificial complexity that contributes to an illusion of productivity”.

More eyeballs, attention, and excitement are a good thing for start-ups selling shiny objects to VCs, and for selling more advertising to consumers…

But, is it REALLY good for the enterprise?

I have been tinkering with all of this for a long time, and have often been a strong evangelist, going back to the earliest days of Web 0.1. But the way things are going seems to be quite a step backward, with an overemphasis on copy-cat patterns from the recent crop of browser-based web applications. They may be fun to play with for a while, but may not equal more productivity or quantifiable business benefit.

Are we going to let the future of software get hijacked with brittle, non-scalable applications that place too much emphasis on linear discussion and voting trails?

Why not just treat the real benefit of Web 2.0 – which I consider the ‘people-connection’ – as an “add-on” or extension embedded into what business productivity software already does (or is supposed to do)? It should not be the “center” of a person’s activity, IMO.

Maybe some intelligence can be brought to all of these systems to provide semantic matching and perhaps sensemaking. But to do that, it will NOT happen in a flat-browser paradigm, nor will you be endlessly clicking and poking around and reading little text fragments.

And, I don’t think we will call it Web 3.0/4.0… it should be more of a “Disappearing Computer” that does not require us to be glued to a web screen 24×7, tracking a multitude of disparate ‘inboxes’ and fragments.

Now that would be the real emerging technology.

Robb Bush
June 3, 2010