Powers Unfiltered

An entrepreneur’s journey into grid computing and partnering with Microsoft, by John Powers


Interesting Digipede Win

December 10th, 2008 · 1 Comment

I’ve written here about Digipede’s financial services customers (about half of Digipede’s business is in that market), but today I’ll talk about an interesting project from the other half of our business. 

We issued a press release today about our recent sale to the US Navy.  You can read that here; in this post, let's go a little deeper.

The Navy has access to huge volumes of very accurate geodetic data — information describing the location and elevation of every point on earth.  (I never knew much about this area until this year, but a lot of decent public information is available.  You can look at this Wikipedia article if you're interested, and dig around from there.)  Processing geodetic data is extremely compute-intensive, and combining that information with 2-dimensional image data is more complex still.

One application for this data is “terrain generation,” a process of converting raw data into a format useful for visualizing terrain in flight simulations.  A group at Lockheed Martin develops specialized software for this purpose; we’ve been working with their TOPSCENE group for some time. 

That group approached us for help in speeding up their terrain generation process.  They've been great to work with — they've put the Digipede Network to work on their problem and achieved great results.  Their own press release about the Digipede-enabled version of their software documents a 20x speedup in processing for our mutual customer, the US Navy.

We’re excited about this application, for a variety of reasons.

First, Lockheed Martin is the largest independent software vendor (ISV) we’ve worked with, and they’ve validated our premise that ISVs would find the Digipede Network (and particularly the Digipede Framework SDK) the best choice for grid-enabling complex applications.  This is an important point for us.  ISVs can work with other vendors, or build their own application-specific distributed computing solution, and Lockheed Martin certainly has the resources to pursue either path — but they chose Digipede, and have achieved great results. 

Second, we see many, many more applications in processing geodetic data.  We’ve already made other sales in that area to government agencies outside of defense (no announcements yet, but stay tuned!), and we see increasing interest from commercial customers in this area as well.   

Finally, as many financial customers struggle with market issues (some of our clients from 2007 no longer exist in recognizable form), it's important that Digipede diversify and demonstrate growth in other markets.  Customers like the US Navy certainly help with that goal.  While government purchasing can be slow, we've been quietly working this area for years, and it's great to see results we can discuss.

Tags: Grid applications · Growth · Press coverage

Microsoft Azure — Looks like a winner

October 27th, 2008 · No Comments

Now that the big announcements about Microsoft’s cloud computing platform are out at PDC, I can finally talk about stuff formerly under NDA.

All I can say is — wow.  The Microsoft transition from cloud-absent to cloud-giant has begun.  There have been hints and leaks and guesses for quite some time now, but the announcements today have begun tying all the pieces together and clarifying the overall strategy.  The best news — it’s clear that Azure is aimed squarely at .NET developers, and that the services provided are amazingly rich. There’s lots of work to do in explaining this multi-faceted new platform to the market, but Microsoft is off to a great start.

I’m not at PDC, but Digipede CTO Robert Anderson is; his more-thoughtful take on today’s PDC announcements is here.

More to come…

Tags: Cloud computing · Events · Partnering with Microsoft

Off Topic: Kiva

October 15th, 2008 · No Comments

OK, it’s “Blog Against Poverty Day,” and while I rarely participate in such blogging “days” (mostly out of disorganization), this seems a good time to share my experiences with Kiva.

This month, we are all learning (often to our dismay) just how connected our financial well-being is to decisions made by far-off strangers.  We can see now that poor decisions in New York and London can lead to financial pain around the world – and in our local communities.

But fortunately, the opposite is also true – a few good decisions locally can lead to great benefits around the world.

Early this year I heard about an organization called Kiva, which helps to organize loans for small businesses in developing countries – a practice known as "microlending," part of the growing field of "microfinance."  What interested me about Kiva was its funding source – individuals, recruited through its Web site.  I checked it out – and was soon lending money to store owners in Tanzania and farmers in Peru, a few dollars at a time.  I recently made my 100th loan on Kiva; many of those loans have already been repaid in full, and none has defaulted to date.

Through a combination of their growing army of individual lenders and their association with local microfinance organizations, Kiva has opened up a new way for entrepreneurs in developing countries to access the capital they need to grow their small businesses.  And those connections start here – I’ve seen other lenders on Kiva.org from my own home town of Lafayette, and from other nearby communities.

I really like what Kiva is doing – but more than that, I also really appreciate the fact that many, many organizations throughout the world are not waiting for a government bailout, are not intimidated by the scope of the problems they face, but instead are bringing innovative solutions to every corner of the globe.  As a result, while we may be connected to AIG and Lehman Brothers in ways not of our own choosing, we can also choose to connect to a store owner in Tanzania or a farmer in Peru.

Tags: Entrepreneurship · Uncategorized

Good Night, AdCenter

October 13th, 2008 · 3 Comments

After two years of testing the promises, wishes and hopes of the Microsoft AdCenter team, and thousands of dollars spent to no avail, Digipede is done with Microsoft’s online advertising.

I, CEO of a Microsoft Gold Certified Partner, do hereby proclaim my opinion, based on firsthand experience, that Microsoft AdCenter is entirely without value to our company, inferior in every measurable way to competing offerings from Google and even Yahoo, and a time-and-money sink of unusual scope, even for Microsoft.

I ran the campaigns myself, took advantage of consulting and optimization offers, tweaked and twiddled the knobs and dials on all three platforms, spent money on all three platforms, and Microsoft is — third.  Distant third.

I posted about AdCenter more than two years ago, and the improvements since that time have been numerous — and meaningless, from the perspective of actual business performance.  I’ve heard Gates and Ballmer and others brag about newer and better algorithms for their advertising platform more times than I can count, and I’ve seen no improvement in clickthrus from prospective customers.

Microsoft’s search engine is fine — it’s come a long way in the last few years, and is now nearly as good as Google in most ways.   But something is desperately wrong with (a) the ad placement algorithms or (b) the way those ads are displayed or (c) the audience that uses Live Search or (d) all of the above, because the right people click through to us from ads placed by Google, and they don’t from ads placed by Microsoft.

Of additional concern is the apparently defective billing mechanism, which (in my experience) continued to bill my account after all campaigns had been paused.  (OK, possibly I screwed up in some way using the less-than-intuitive AdCenter interface, which I find clumsier than its competitors, and somehow missed pausing something — although I doubt it.)  Last week I finished working through this last minor billing issue with a very helpful and friendly Microsoft representative (I'm screwed, but only out of about $80 — nothing compared to the losses from legitimate bills), and have shut down our account.

Anyone from Microsoft is free to call me about our experience with your online advertising service — I’m at 510-834-3645 ext. 101 — just so long as the call does not present me with new opportunities to use this offering.  We’re done with it.

Tags: Partnering with Microsoft · Usability

Last month’s least surprising (and least correct) cloud pronouncement

August 14th, 2008 · 2 Comments

OK, this one came out while I was on vacation, so it’s a little old, but I can’t let it pass.

The new head of Red Hat, Jim Whitehurst, says “The clouds will all run Linux.”

Really? What’s next? The head of Boeing telling us that all transportation will be via 787s? Or maybe a statement from OPEC saying that cars will all burn gasoline?

Get serious. “The Clouds” will run Linux and UNIX and Windows and OS X and whatever else paying customers want.

“The Clouds” are already running more than just Linux, and if cloud computing is going to grow beyond today’s super-early-adopter proof-of-concept market, it’s going to get more diverse, not less.

Every time there’s a new IT buzzword (oops, I mean “revolution”), some market-oblivious engineer or attention-deficit analyst declares that finally legacy computing is dead, a new paradigm is here, there’s One Right Way to do everything now, the open-source rapture is at hand, you’re free from your chains, yada yada yada.

But a guy like Jim Whitehurst should know better.  Yeah Jim, “the clouds” are going to kill Microsoft. Yeah Jim, “the clouds” mean Oracle is finished. Yeah Jim, “the clouds” will all run Linux, and Slashdot will replace all other news outlets.

I’ll have lots more constructive things to say about cloud computing soon here, but for now let me just say: the cloud computing offerings I’ve seen so far look a lot like computing. There’s hardware infrastructure, there are operating systems, there are development tools, there are applications, APIs, and user interfaces. There are administrative tools, management consoles, and buckets of kludgy tricks to make anything actually work the way you want it to work. Different vendors expose different parts of all this to their users in different ways as they struggle to differentiate. But anyone who believes that “the clouds” will “all” standardize on a single OS (or database, or programming language, or much of anything else) is just blowing smoke.

Tags: Cloud computing · Press coverage · Usability

Good article on Dan’l Lewin in (gasp) the SF Chronicle

June 27th, 2008 · No Comments

The San Francisco Chronicle is not exactly the planet’s leading source of technology news and analysis.  So many of you probably haven’t yet seen Deborah Gage’s excellent article today about Dan’l Lewin, Microsoft’s ambassador to Silicon Valley.  Dan’l is among our most important contacts (and favorite people) at Microsoft, and despite his high-visibility role, many people (including many entrepreneurs) still don’t understand the value he can bring to a startup.

In a single sentence containing at least three significant understatements, Ms. Gage writes:

Microsoft still gets criticized sometimes for being slow to the Internet or hard to do business with, but Lewin has won praise over the years for his courtesy, efficiency and ability to connect outsiders to the right people inside Microsoft, which is not an easy task.

Whew.  Let’s parse that.

“Microsoft still gets criticized sometimes for being . . . hard to do business with . . .”  There is no question that doing business with ANY huge company is hard.  Building a close relationship with Microsoft (or any tech giant) is not for the faint of heart.  Microsoft presents some special challenges that I could go on about at length (oh wait, I’ve done that multiple times…), but let’s just stipulate that some of these criticisms are justified while some are not.

“…but Lewin has won praise over the years for his courtesy, efficiency and ability to connect outsiders to the right people inside Microsoft…”   Bingo.  Dan’l Lewin has done more to expose the helpful side of Microsoft to startups, entrepreneurs, and VCs than anyone would have thought possible just a few years ago.  His Emerging Business Team is the API for startups that want hooks into Microsoft.  Digipede has received numerous tangible and intangible benefits from working with the EBT; the group brings the attitude that they can’t wait to help interesting startups, and it’s Dan’l who sets the tone and agenda for that critical group.

“…which is not an easy task.”  No kidding.  I'm back to my API analogy.  If you would rather try to reverse-engineer the Microsoft org chart from the outside, good luck — but a single call to the EBT can get you to the right person within Microsoft faster than any other method I know.

The Bay Area is teeming with “experts” who would have us believe that Microsoft has become irrelevant.  In my experience, entrepreneurs ignore Microsoft at their peril.  Far better to understand what they're doing and why than to pretend they aren't there.  Dan'l and his team are great resources for entrepreneurs who want to understand and work with Microsoft.  So — good job, Ms. Gage, for profiling Microsoft's local champion of innovation.  Well worth reading.

Tags: Entrepreneurship · Growth · Partnering with Microsoft · Press coverage · Startup Life

Off Topic — Solar Update

June 20th, 2008 · No Comments

As I wrote earlier this month, the Powers household is now generating some of its own electricity, via photovoltaic panels on the roof.  Soon after the installation, our local utility PG&E installed a spiffy new digital meter and certified the system for use.  I turned it on last Thursday afternoon.  Yesterday afternoon (one week later) I played around with the power inverter to see the cumulative energy production since the system was turned on.

In exactly one week of long, sunny days, the system of 18 panels produced 154 kWh — or 22 kWh per day.  In that same period (which included some nice days and some very hot days — it was over 100 degrees F at our house yesterday), the new meter tells me that our house used all that plus 151 kWh more from the grid.  So even with the pool filter running and above-normal air conditioner usage, the solar panels produced almost exactly half the electricity used by our large-ish suburban California home.  Not bad, given that we could only put solar panels on a relatively small fraction of the roof (too much shade on the rest of it).
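A quick back-of-the-envelope check of those numbers, using only the meter and inverter readings quoted above:

```latex
% daily average and solar fraction from the readings above
\frac{154~\text{kWh}}{7~\text{days}} = 22~\text{kWh/day},
\qquad
\frac{154~\text{kWh}}{154~\text{kWh}+151~\text{kWh}} \approx 50.5\%
```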

More economic analysis to follow after I get a bill or two.

Tags: Uncategorized · Utility Industry

Digipede and Velocity

June 9th, 2008 · 2 Comments

The world is indeed getting smaller — and quicker, and better connected. And you might as well talk about what you’re doing, because smart well-connected people will figure everything out immediately anyway.

Case in point: Marc Adler’s recent post about Velocity, in which he says:

. . . It is no secret that the most prevelant use of object caches on Wall Street is with Grid Computing. How will Velocity interface with Compute Cluster? How about Digipede (if I know John Powers, he probably has support for Velocity already)?

Well, Marc, if you're going to post on Sunday mornings, you may not get instant confirmation of your clever guesses, but now it's Monday afternoon, so I can confirm — yes, we have a working proof of concept (PoC) in the lab at Digipede, proving (to ourselves at least) that the Digipede Network and Velocity work great together. And yes, it provides significant performance improvements for certain types of activities important to Wall Street folk.

If anyone is still wondering why we chose .NET as the basis for our grid computing software, this is just the latest example — Microsoft just keeps giving us great free stuff on which to build. The Velocity CTP came out last week; this week, we have working code that provides real benefits.
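To give a flavor of why a distributed in-memory cache matters for grid work, here's a minimal sketch of the general pattern involved. The cache interface and task class below are hypothetical stand-ins for illustration only; they are not the Velocity CTP API and not the Digipede Framework SDK.

```csharp
// Hypothetical sketch: large, shared reference data is published once into a
// distributed cache, and each grid task reads it from the cache fabric
// instead of having it serialized into every task object.
using System.Collections.Generic;

public interface IDistributedCache   // stand-in for a Velocity-style cache client
{
    void Put(string key, object value);
    object Get(string key);
}

public class PricingTask             // stand-in for a grid task, not the Digipede SDK
{
    public double Execute(IDistributedCache cache, string instrumentId)
    {
        // Every node pulls the same market-data snapshot from the cache,
        // so the grid only has to move the small per-task inputs around.
        var marketData = (IDictionary<string, double>)cache.Get("marketdata/2008-06-09");
        double spot = marketData[instrumentId];

        // ...the actual compute-intensive pricing work would happen here...
        return spot;
    }
}
```

The win is in data movement: for the cache-friendly workloads Wall Street cares about, publishing a big data set once and letting every node read it beats re-shipping that data with every task.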

Rob just posted his initial comments; there will be more.  And we’ll have feedback for the Velocity team.  Watch this space (and Rob’s, and Dan’s…).

Tags: Grid applications · Partnering with Microsoft

Grid Today connects the dots on Velocity and Digipede

June 9th, 2008 · No Comments

I can't believe I still haven't had a chance to write about Velocity, Microsoft's recently announced distributed in-memory cache.  I think this is just further proof that I have an endless backlog of topics about which I should be writing.

In any case, Derrick Harris of Grid Today has done a great job of connecting the dots for us in his excellent article today.  Read the whole article, because he offers good insight on how important this announcement really is, but here’s his analysis of how it affects Digipede:

Whatever emerges from Velocity also should be good news to Microsoft’s technology partners — in particular Digipede, which has been delivering distributed computing to .NET apps and now might get the add-on technology it needs to compete with the big boys. Digipede has received no shortage of praise from customers and commentators alike about its relatively inexpensive and very user-friendly solution, but one of the drawbacks has been its limitation in terms of what types of jobs the Digipede Network can handle, namely CPU-intensive jobs benefitting from parallel processing. If Microsoft and Digipede can make Velocity and the Digipede Network function as a unit and keep the price down, Digipede could find itself selling to a whole new, real-time-data-loving audience. That this integration will occur is pure speculation on my part, but it seems to make sense on the surface.

I have no comment on specifics at the moment, but let’s just say — Derrick, you nailed it.

Tags: Grid applications · Partnering with Microsoft · Press coverage

It works in the lab — now what?

June 9th, 2008 · No Comments

Digipede CTO Robert Anderson is blogging about a recent experiment we conducted in our lab, assessing what it would take to get the Digipede Agent running on Mono. (For those of you who don't know, Mono is a cross-platform implementation of .NET, developed as an open-source project led by Miguel de Icaza and sponsored by Novell.)

And as he reports, thanks to improvements in both Mono and the Digipede Network, the answer is — not much. We’ve got a working prototype of a Digipede Agent running under Mono on Linux that runs a Digipede job.
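For the curious, here's the sort of trivial runtime check that comes up once you start caring about Mono. This is a generic snippet for illustration, not code from the Digipede Agent:

```csharp
using System;

static class PlatformCheck
{
    static void Main()
    {
        // The usual idiom for detecting Mono: the Mono runtime defines a
        // Mono.Runtime type that Microsoft's CLR does not.
        bool onMono = Type.GetType("Mono.Runtime") != null;

        Console.WriteLine(onMono
            ? "Running under Mono (for example, on a Linux node)."
            : "Running under the Microsoft .NET CLR.");
        Console.WriteLine("OS: " + Environment.OSVersion);
    }
}
```

The interesting part of the experiment wasn't tricks like this, though; it was how little of our existing .NET code needed to change at all.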

Digipede on Linux? Has the world turned upside down?

Hardly.

Since the beginning, Digipede has been focused on adding value to the Microsoft platform. And customers know that. Customers also understand that Microsoft is getting better and better at making sure its products interoperate with others, even on other platforms, and that Microsoft's partners need to facilitate that interoperability. We get questions from customers pretty frequently about Mono, and lately those questions have gotten more specific, so it seems prudent to investigate any technical blockers from time to time.

So let me restate what Robert said, and what I said above — this is an initial assessment, a technical experiment only, not a shipping product.  Neither Rob's post nor this one is a product announcement — this is a blatant “trial balloon.”  We want to hear what the market thinks of Digipede on Mono.

Why might this be interesting? Let’s back up a step and take a look at enterprise grid and HPC deployments.

Most enterprise customers have what is often called a “mixed” IT environment. That’s a euphemism for an unplanned and chaotic assortment of technologies that have piled up over the years into some type of barely-managed infrastructure. In almost every enterprise, Windows runs on most or all of the desktops. In almost every enterprise, there is some mixture of Windows and Linux servers, with maybe some Solaris and/or other UNIX flavor(s) thrown in. In almost every enterprise with an HPC infrastructure, most or all HPC nodes run Linux.

This is just reality — Windows is miles ahead in 2008 desktop market share, and Linux is miles ahead in 2008 HPC market share. Do I wish it were different? Sure — if Microsoft had a bigger share of the HPC market (and we’ve been working diligently to help make that happen), we’d have an even bigger market into which we could sell our software. And that will happen, I have no doubt. We tell all our customers “Windows HPC Server is the best option for adding power to a Digipede grid,” and that’s the truth. Go buy some now.

But the fact remains, there’s a lot of existing infrastructure — desktops, 32-bit Windows servers, Linux servers and cluster nodes, Solaris servers, and more — that enterprises are not going to throw away.  All this infrastructure represents potential grid computing power.  The Digipede Network has always run on heterogeneous Windows networks — with Agents running on 32- and 64-bit Windows desktops, 32- and 64-bit Windows servers, and cluster nodes running Windows HPC Server (formerly Compute Cluster Server). Our reluctance to include boxes that don’t run Windows has always been mostly about applications — it’s still relatively rare to find applications that are actually deployed across multiple operating systems simultaneously.

But as Mono gets better and better, we hear from enterprise customers and prospects who are getting more interested in it. They like the idea of being able to use more of their existing infrastructure more efficiently. They want to take advantage of Digipede’s great developer experience to deploy more applications — with minimal changes to that infrastructure.

So let’s get back to Robert’s closing question: “Now that we can do it, what should we do with it?” What do you think? Is the market crying out for a multi-OS .NET grid? Or is what we’re hearing just idle curiosity?  Let’s hear from all sides.

Tags: Compute Cluster Server · Grid applications · Partnering with Microsoft · Usability