Tech in the 603, The Granite State Hacker

Champions of Disruption

I’ve been noticing lately that truly interesting things only happen on the “edge”. Everything is energy, and everything happens at the point where energy flows are disrupted.

If you don’t believe me, just ask Mother Nature. Take solar energy. Powerful energy flows from our sun and saturates our solar system… but all the amazing things happen where that energy flow is disrupted. The Earth disrupts it, and the result, in this case, is merely life as we know it.

It’s so primal that we’ve abstracted the concept of energy flows, and call it (among other things) currency. When we sell a resource (a form of energy, in a sense), we even call that change “liquidation”.

Sure, potential energy has value, but there are no edges in a region of potential energy. Potential energy is usually static, consistent, and only really exciting for what it could do or become, rather than what it currently is.

Likewise, it’s where disruptions occur that there’s business to be done.

According to this article on InformationWeek, CIOs and CTOs appear to have generally become change-averse order takers. The surveys it cites indicate that many shops are not actively engaged in strategy or business process innovation.

Perhaps they’re still feeling whipped by the whole “IT / Business Alignment” malignment. Maybe they’re afraid that business process innovation through technology innovation would come off as an attempt to drive the business. Ultimately, it seems many are going into survival mode, setting opportunity for change aside in favor of simply maintaining the business.

Maybe the real challenge for IT is to help business figure out that innovation is change, and change is where the action is.

In any case, it seems there’s a lot of potential energy building up out there.

The disruptions must come. Will you be a witness, a victim, or a champion of them?

Retail IT in the Enterprise

Lately, the projects I’ve been on have had me taking on roles outside my comfort zone. (I’m not talking about downtown Boston… with the “Boston Express” out of Nashua, I’m OK with that.)

I’ve always been most comfortable, myself, in cross-discipline engineering roles, especially in smaller teams where everyone’s got good cross-discipline experience. The communications overhead is low. The integration friction is low. Everyone knows how it needs to be done, and people are busy building rather than negotiating aggressively.

These tight, focused teams have always had business-focused folks who took on the role of principal consultant. In this type of situation, the principal consultant provides an insulation boundary between the technical team and the customer.

This insulation has made me comfortable in that “zone”: I’m a technologist. I eat, sleep, and dream software development. I take very seriously the ability to communicate complex technical concepts effectively and concisely with my peers.

So like I said, lately the projects I’ve been on have yanked me pretty hard out of that zone. I’ve been called on to communicate directly with my customers. I’ve been handling item-level projects, and it’s a different world. There is no insulation. I’m filling all my technical roles, plus doing light BA and even PM duty.

Somewhat recently, I emailed a solution description to a CFO. The response: “Send this again in user-level English.”

It killed me.

I’ve gotten so used to having others “protect” me from this sort of non-technical blunder. On my current projects, those insulating consulting roles are simply not present.

Makes me wonder about the most important lessons I learned during my school days… In high school, maybe it was retail courtesy and retail salesmanship in a technical atmosphere (“Radio Shack 101”). In college, the key lesson might have been how to courteously negotiate customer experience levels (“Help Desk 101”).

Semi-IT / Semi-Agile

While working on-site for a client, I noticed something interesting. On the walls of some of my client’s users’ offices, along with other more classic credentials, are certifications from Microsoft… SQL Server 2005 query language certifications.

I’ve heard a lot about the lines between IT and business blurring. We talk a fair amount about it back at HQ.

Interestingly, this case is a clear mid-tier layer between classic IT (app development, data management, advanced reporting) and the business, in the form of ad hoc SQL querying and cube analysis. In many ways, it’s simply a “power-user” layer.
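
To give a flavor of what that power-user layer does in practice, here’s the kind of ad hoc query one of those certified business users might run against a reporting database. Consider it an illustrative sketch only; the table and column names are hypothetical, not anything from the client’s actual schema.

-- Hypothetical ad hoc query a certified business "power user" might write:
-- monthly sales totals by region, pulled straight from a reporting table.
SELECT r.RegionName,
       DATEPART(year, o.OrderDate)  AS OrderYear,
       DATEPART(month, o.OrderDate) AS OrderMonth,
       SUM(o.OrderTotal)            AS TotalSales
FROM dbo.Orders o
     JOIN dbo.Regions r ON r.RegionID = o.RegionID
GROUP BY r.RegionName, DATEPART(year, o.OrderDate), DATEPART(month, o.OrderDate)
ORDER BY r.RegionName, OrderYear, OrderMonth;

Nothing fancy, but it’s the sort of request that would once have gone through a classic IT reporting queue.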

The most interesting part about it is the certification, itself. The credentials that used to qualify an IT role are now being used to qualify non-IT roles.

Another trend I’m seeing is that development ceremony expectations vary with the risk of the project. Higher-risk projects are expected to proceed with more waterfall-style ceremony. Lower-risk projects proceed with more neo-“agility”.

The project I was on was apparently considered “medium” risk. The way I saw this play out was that all of the documentation of a classic waterfall methodology was expected, but the implementation was expected to develop along with the documentation.

In many ways, it was prototyping into production. Interestingly, this project required that approach: the business users simply did not have time to approach it in full waterfall fashion. Had we been forced into a full-fledged classic waterfall methodology, we might still be waiting to begin implementation rather than finishing UAT.

WALL•E and Enterprise Data Landfills

“Life is nothing but imperfection and the computer likes perfection, so we spent probably 90% of our time putting in all of the imperfections, whether it’s in the design of something or just the unconscious stuff.”
-Andrew Stanton, director of Disney/Pixar’s WALL•E, in an interview on the topic of graphic detailing.

I’m enough of a sci-fi geek that I had to take my kids to see WALL•E the day it opened. I found it so entertaining that, while on vacation, I started browsing around the internet… digging for additional tidbits about the backstory.

I initially came across the quote above on Wikipedia’s WALL•E page.

That simple truth carries across all applications of contemporary computer technology. Technology tools are designed for the “general” cases, and yet, more and more often, we’re running into the imperfect, inconsistent, outlying, and exceptional cases.

To follow the thought along, perhaps 90% of what we do as software developers is about trying to get a grip on the complexities of… everything we get to touch on. I guess the remaining 10% would be akin to the root classes… the “Object” class, and the first few subordinates, for example.

Andrew Stanton’s quote reminds me of the 90-10 rule of software engineering: 90% of the code is implemented in 10% of the time (and conversely, the remaining 10% of the code takes the remaining 90% of the time). I tend to think of this one as a myth, but it’s a fun thought.

Dealing with the rough fringes of our data is among the industry’s current challenges, and it’s not just in corporate data landfills.

I recently heard a report suggesting that technology will get to the point where commercially available vehicles will come with an auto-pilot within the next 20 or so years. What’s really the difference, to a computer, between financial data and, say, navigational sensor data?

So, to flip that idea on its head again: you could have more intelligent artificial agents spelunking through data warehouses… WALL-DW? (Data Warehouse edition)

Then again, I wonder if the 80/20 rule isn’t what gets us into our binds to begin with.

Enterprise Reprogramming

I found an interview with a Satyam senior VP, posted on LinkedIn, relatively interesting (link below).

This senior VP talks about how Satyam and the IT industry are responding to new challenges.

One thing that stands out to me is the statement that they are moving from services to solutions. The implication is that they are rebuilding, or reprogramming, businesses at the workflow / process level. They appear to be successfully applying technology build-out as a commodity service while implementing their solutions… It sounds like they’re treating the enterprise as a sort of programmable platform, like SOA / BPM on a grand scale.

From the article:
“A solutions provider transforms business. The difference in business will happen when we change those business processes as well. That is where we are bringing in business transformation solutions — process optimisation, process reengineering, etc. “

My intuition doesn’t quite square with Satyam’s vision.

Lots of things have been pointing towards more innovation in the top layers of applications, built on a very stable technology base. To me, it still feels like there’s an unspoken motivation for that: business leadership wants IT folks to make ruggedized app dev tools and hand them over to power users (and/or process owners). Business leaders want IT to get the C# off their business processes.

All of that is sorta where I started cooking up the hypothesis of metaware.

I’m curious to know how Satyam’s vision is really working. I guess we’ll know in a few years.

‘Moving towards IP-led revenues’

The Mature Software Industry, Corporate Specialization, p2

While driving down I-293 around the city of Manchester one night this weekend, I noticed some of the old factories across the river were lit up so you could see the machinery they contained. Those machines have probably been producing goods reliably for decades.

In my last post (“Corporate Specialization”), I used an analogy of power plants to describe how software engineering groups might someday fit into the corporate landscape.

I found myself thinking that a more precise analogy would be to liken software application development to… hardware application development.

When it comes down to it, hardware, software… the end user doesn’t care. They’re all just tools to get their real jobs done.

I remember seeing this when I was a kid. I recall observing that when the Atari 2600 had a cartridge inserted and was powered on, the hardware and software components were functionally indistinguishable. The complete system might as well have been a dedicated-purpose hardware machine. It became an appliance.

Modern platforms don’t really change that appliance effect to the end-user.

So, aside from operators, I’m sure these classic B&M manufacturers have technical people to maintain and manage their equipment. I’d be surprised to find out that they keep a full team of mechanical engineers on the staff, though. It seems to make sense that a mature software development industry will start to look much more like that over time.

Further, take a look at computer hardware. It’s undergone some maturing over the past few decades, too. There really aren’t many companies that actually bake their own. I remember tinkering a bit with chips & solder. Then I started building PCs from components. While my current desktop at home is still one of my own “Franken-PCs”, I think it’s been over a year since I even cracked the case on it. I suspect that when I finally break down and replace the thing, it will be 100% Dell (or maybe Sony) [and Microsoft].

With respect to software engineering, never mind all that FUD we’re hearing about IT getting sucked into business units. That’s good for “operators” and “maintenance” types, who become analytics and process management in the new world order. I suspect the heavy lifting of software engineering will eventually be done mostly by companies that specialize in it.

With that, it might be more educational to look at the histories-to-date of the industrial-mechanical and electrical engineering groups to see the future of the software engineering group.

I think this might be where MS & Google are really battling it out, long term. As the software industry continues to mature, the companies with the most proven, stable software components will end up with the market advantage when it comes to building “business factories” out of them for their clients. …or maybe it will just be about the most mature development and engineering processes… or maybe both?

Corporate Specialization

There’s an old adage: “Anything you can buy, you can produce yourself for less.”

In our business, we’re well aware that there are a few fundamentally flawed assumptions with that sentiment. Despite the barriers to entry and many other recurring costs, somehow the idea seems pervasive in the business world.

I started my career in a consulting shop that insisted it was a product company. Then I moved, through market forces, into product-based companies. I stayed on board with some of them long enough to help shut out the lights and unplug the servers when sales didn’t hit the marks. The other big problem I’ve seen with product shops was that the engineering group typically went through its own “release cycles”… once the product was released, my team was cut to the bone.

I’ve never been in a classic IT shop, per se, but I’ve definitely been on tangential IT-support projects within “product”-oriented companies. In IT groups, I’ve often thought that companies might see IT projects as frills and overhead. At some level, I think the pervasive IT-alignment push is a countermeasure to that idea. Still, it seems IT projects are typically viewed as a liability rather than an asset. When it comes time for the company to refocus on its core competencies (which is inevitable in the ups & downs of business), IT projects are prime targets for cutbacks.

Since the “.COM bust”, an engineer on staff at these types of companies for just three years is often seen as a “long-timer”… but no one feels bad about that, since a lot of these companies fold in that time frame as well.

After experiencing the product-based companies’ ups & downs, and seeing many colleagues who had already been through the IT shops, I’m convinced… the outsourced project / consulting route is really the wave of the future, even more than it has been in the past. It’s only a matter of time before business admits this necessity as well.

It makes sense… I wouldn’t figure on every company to own & operate their own power plant.

Why should they typically want their own software engineering group if they don’t have to?

[Edit: See also Part 2 Here]