A marvellous exploration of a research and innovation powerhouse that, even viewed from this age of innovation, surprisingly anticipated many approaches we think of as modern breakthroughs. I’ve long admired Bell Labs and feel that many of its researchers and innovations intersected with and influenced my own career. While I was at university, the notion of working with or at Bell Labs was the highest aspiration for top thinkers in many fields. The Idea Factory is an engaging read and showed me how limited my understanding of that institution really was.
First of all, from the 1920s to the 1980s, it was way ahead of its time as an agent of innovation. The approaches were brilliant and could be applied today, including the notion of building architecture and organization structures to encourage interdisciplinary collaboration. Breaking down “knowledge silos” was definitely countercultural in a century known for specialization.
Secondly, the sheer number of transformational inventions that came from a single institution, including the laser, the transistor, fibre optics, satellite communications, the cellular mobile network, integrated circuits and the notion of information as digital, is both surprising and something that would be impossible in today’s world. Sadly, in the modern competitive marketplace, there is likely no room for a monolithic regulated monopoly, as AT&T was, to support such a single engine of innovation and basic research.
My primary connection with Bell Labs was through computer science, with innovations such as UNIX and the C Programming Language. The historical context this book outlines shows how surprising this is, because AT&T was, by regulatory decree, precluded from entering the computer industry. That said, it is ironic that most of the inventions of Bell Labs collectively contrived to make telecommunications as a separate industry obsolete. Instead, as predicted as early as 1948 by the remarkable information age seer, Claude Shannon, much of the modern economy has been transformed by our current digital age of networked and pervasive computing.
Lastly, Gertner explores the culture of those who drove innovation. Often eccentric and, to outsiders, perhaps impossible or unemployable, these individuals had the sheer force of will and brainpower to achieve breakthroughs that others either hadn’t even considered or thought impossible. Given my own small-town origins, the deliberate strategy of finding these small-town prodigies to populate the largest research-oriented brain trust in the world resonated.
All too often, societies believe that they are the first to master innovation. Sometimes we should stop and consider successful strategies from the past. Far from being solely a modern preoccupation, innovation has always been a hallmark of human advancement. Yet, with no clear place for a lucrative and regulated monopoly to fund pure research, where will the fundamental research of the future originate?
The book cites John Mayo, a former Bell Labs chief:
“Bell Labs’ substantial innovations account for a large fraction of the jobs in this country and around the world.”
In a world driven by global markets and the quarterly thinking of Wall Street, we really do need to consider how our next leap of fundamental research will be unleashed. John Pierce, another Bell Labs chief, summarized the “Bell Labs formula” in four main points:
“A technically competent management all the way to the top. Researchers who didn’t have to raise funds. Research on a topic or system could be, and was, supported for years. Research could be terminated without damning the researcher.”
Beyond learning from the wisdom of the leading research institution, where will we find the vision and resources to enable innovation on such a transformational scale? Beyond the Venture Capital and now Angel funded technology startup ecosystem, perhaps exemplars like Mike Lazaridis’s pioneering Perimeter Institute for Theoretical Physics will chart a course for the 21st century.
If you are in any way connected to this story, see the link to the event invitation at the end of this post.
In August 1972, just before the start of fall classes, a new arrival was causing a stir in the Math & Computer building at the University of Waterloo – a brand-new Honeywell 6050 mainframe-class computer running GCOS (General Comprehensive Operating Supervisor) and TSS (TimeSharing System). The arrival of this computer (which quickly got nicknamed “HoneyBun” and eventually “The ‘Bun”) set the stage for a whole new generation of computer innovators at the University of Waterloo and was the foundation for many a computer and internet innovator.
In retrospect, it was a fortuitous time to be young and engaged in computing. A fluid group of enthusiast programmers, “The Hacks” (a variant of the term “Hackers” popularized by MIT, yet not to be confused with the later “Crackers” who were all about malicious security breaches), revelled in getting these expensive (yet, by today’s standards, underpowered) machines to do super-human feats. The early 1970s were when software was coming into its own as a free-standing discipline, for the first time unbundled and unshackled from the underlying hardware. The phenomenon of the timing of one’s birth shaping whole careers is eerily (the years are the same as my own) described by Malcolm Gladwell in his 2008 book Outliers.
The Honeywell had a whole culture of operators, SNUMBs, LLINKs, GMAP, MMEs, DRLs and Master Mode, not to mention that infamous pitcher of beer for anyone who could break its security. To do so was remarkably easy. For example, one day the system was down, as was commonplace in those days. As it happened, the IBM 2741 terminals were loaded to print on the backs of a listing of the entire GCOS operating system. Without the ‘Bun to amuse us, we challenged each other to find at least one bug on a single page of this GCOS assembler listing. And, remarkably for a system reputed to be secure, each of us found at least one bug that was serious enough to be a security hole. This is pretty troubling for a computer system targeted at mission-critical military applications, including running the Worldwide Military Command and Control System (WWMCCS, i.e. the nuclear early warning and decision mechanism).
Shortly after the arrival of the Honeywell, Steve Johnson came to the Math Faculty on sabbatical from Bell Labs. The prolific creator of many iconic UNIX tools such as Yacc, he is also famous for the quote: “Using TSO is like kicking a dead whale down the beach.” I suspect that few people realize his key role in introducing Bell Labs culture to the University of Waterloo so early, including the B Programming Language, getchar(), putchar(), the beginnings of the notion of software portability and, of course, yacc. It is hard to overestimate the influence of the Bell Labs culture on a whole generation at Waterloo – a refreshing switch from the IBM and Computing Centre hegemony of the time.
The adoption of the high level language B, in addition to the GMAP assembler, unleashed a tremendous amount of hacker creativity, including work in languages, early networking, very early email (1973), the notion of a command and utilities world (even pre-UNIX) and some very high level abstractions, including writing an Easter date calculator in the macros embedded inside the high level editor QED.
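As a small illustration of that last hack, here is a hedged reconstruction (mine, written in C rather than QED macros, which are long lost to me) of the kind of Easter date calculation involved, using the well-known anonymous Gregorian algorithm:

```c
/* Date of Easter (Gregorian calendar) via the anonymous
   "Meeus/Jones/Butcher" algorithm -- a sketch of the calculation the
   QED macro hack performed, not the original code. */
#include <stdio.h>

static void easter(int year, int *month, int *day)
{
    int a = year % 19;
    int b = year / 100, c = year % 100;
    int d = b / 4,      e = b % 4;
    int f = (b + 8) / 25;
    int g = (b - f + 1) / 3;
    int h = (19 * a + b - d - g + 15) % 30;
    int i = c / 4,      k = c % 4;
    int l = (32 + 2 * e + 2 * i - h - k) % 7;
    int m = (a + 11 * h + 22 * l) / 451;

    *month = (h + l - 7 * m + 114) / 31;          /* 3 = March, 4 = April */
    *day   = ((h + l - 7 * m + 114) % 31) + 1;
}

int main(void)
{
    int month, day;
    easter(1973, &month, &day);
    printf("Easter 1973: %d/%d\n", month, day);   /* prints 4/22, i.e. April 22 */
    return 0;
}
```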
Ultimately, Steve’s strong influence led to the University of Waterloo being among the first schools worldwide to get the religion that was (and is) UNIX. As recounted in my recent post remembering the late Dennis Ritchie, CCNG was first able to get a tape directly from Ken Thompson and run UNIX as early as 1973. That machine is pictured below. A few years later, several of us UNIX converts commandeered, with assistance from several professors, a relatively unused PDP-11/45 on the 6th floor of the Math building. This ultimately became Math/UNIX, which provided an almost production-quality complement to the ‘Bun on the 3rd floor. We built networked file transfer, printing and job submission applications to connect them, work that even became the subject of several journal papers.
Photo Courtesy Jan Gray
So, whether you were an instigator, quiet observer or just an interested party, we’d love you to join us to commemorate the decade of creativity unleashed by the arrival of the Honeywell 6050 forty years ago. We’ve got a weekend of events planned from August 17-19, 2012, with a special gala celebratory dinner on the 18th. We hope you can join us and do share this with friends so that we don’t miss anyone. Check out the details here at:
And, do try to scrounge around in your memories for anecdotes, photos and other things to bring this important milestone to life. Long before Twitter handles, I was rjhoward, so do include your Honeywell userID if you can recall it.
Today was a banner day for announcements involving a reset of the technology funding ecosystem in Canada.
For a long time, the slow demise of Canadian Venture Capital has concerned me deeply, putting us at an international disadvantage in regards to funding and building our next generation of innovative businesses. You may recall my 2009 post Who Killed Canadian Venture Capital? A Peculiarly Canadian Implosion? which recounts the extinction of almost all of the A round investors working in Ontario.
Since then, many of us have worked to bridge the gap by building Angel Networks, including Golden Triangle AngelNet (GTAN), where I chair the selection process, and by using extreme syndication and leverage to replace a portion of the missing A rounds.
Today, the launch of Round 13 Capital revealed a new model for venture finance centred around a strong Founder Board whose members are also LPs, each with a “meaningful” investment in the fund. My decision to get involved was based both on this strongly aligned wealth of operating wisdom and on the clear strength of the core team.
The launch was widely covered by a range of tech savvy media, including:
To illustrate both the differentiation of Round 13 and the depth of founder experience, Bruce Croxon indicated that the Founder Board has, measured by aggregate exit value, built over $2.5 billion of wealth in Canada. It is this kind of vision and operational experience that directly addresses the second of my three points that Canadian Venture Capital needs to solve.
It is exciting to be involved with the unfolding next generation funding ecosystem for technology companies of the future. Time will tell the ultimate outcome, but I’m certainly bullish on Round 13.
It is notable that much of the recent trend towards Social Innovation has come from people who began their careers in technology startups, in Silicon Valley or other technology clusters. Some notable examples include:
Bill Gates, partly at the instigation of Warren Buffett, who added his personal fortune to that of Gates, left Microsoft, the company he built, to dedicate his life to innovative solutions to large world issues such as global health and world literacy through the Bill and Melinda Gates Foundation
Started by Paul Brainerd, Seattle-based Social Venture Partners International is innovating at the intersection of technology and venture capital with Venture Philanthropy. Paul sold Aldus Corporation (an innovator in desktop publishing applications, including PageMaker) to Adobe in the mid-1990s. In his mid-40s at the time of the Adobe acquisition, he was young enough to seek a significant and active social purpose in his life.
Waterloo’s own Mike Lazaridis aims to transform our understanding of the universe itself by investing hundreds of millions of dollars into the Perimeter Institute for Theoretical Physics and the Institute for Quantum Computing, effectively innovating a new mechanism of education and discovery. Notable is that this area of investment is one that may well take years, possibly decades, to show what breakthroughs, if any, are discovered.
Whether or not this is always attributable to the connection with technology entrepreneurs, Social Sector organizations are increasingly starting to become much more like the entrepreneurial startups so familiar in the world of high technology. I’ve personally witnessed some of this change, and would like to suggest that, while there remain big differences, the parallels are strengthening over time. The following concepts represent just a small sampling of the key areas of similarity:
1. Founders Versus Artists
Stories are legion of smart, brash (and even mercurial) technology company founders who transform a business sector through the sheer strength of their wills. Many of these founders are “control freaks” and might find employment in conventional jobs a difficult proposition. Venture capital and angel investors have learned to be wary of such founders, citing numerous examples of founderitis – in which uncoachable founders, in a case of “my way or the highway”, would rather maintain control than bend to ideas from often more experienced mentors, board members and investors.
Such personalities also exist in the Social Sector. For example, many arts organizations are founded by bright and innovative artistic directors. And yet, many of these same organizations come unravelled by the same mercurial nature that prevents the organization from being properly governed and accountable to funders (investors). With my background on both sides of this divide, the parallels are hauntingly striking.
Since such founders’ strengths can also be their undoing (or that of their organization), a conscious Board-level assessment of such situations is always wise.
2. Running on Empty
Notwithstanding the media coverage of a few lucky technology startups such as Facebook or Google, most technology startups run on little or no significant funding. Many seek to change the world with very small amounts of capital, sometimes no more than several million dollars. The recent trend towards building such small-capitalization organizations is called the Lean Startup movement. The challenges inherent in their undercapitalization are often the top complaint of such startups. However, Sergey Brin, the Google co-founder, has insightfully observed that “constraints breed creativity” to describe how an underfunded state has led to the discovery of innovative ways to build companies and deliver their products.
Likewise, from my experience the vast majority of charities and nonprofits complain about being undercapitalized, and the reality is that most are. It is a fact of life in the social sector. Only now are we starting to see the emergence of social ventures which, by stealing a page from underfunded technology startups, are exploring new business models and ways to deliver social change, often leveraging IT or a different process to vastly reduce the costs of program delivery.
3. Technology Changes Everything
We’ve seen the emergence of a world where all information is stored in digital form and people are connected even while mobile; the role of the web and technology can’t be overstated. Technology-based startups, because they are small and start from scratch, often approach traditional problems in very non-traditional ways. Revenue and funding models change, as do fundamental ways to organize a business or social enterprise. Social media allows ideas to spread in a viral fashion. We have already seen how organizations like Avaaz can mobilize hundreds of thousands or even millions of supporters globally for both local and international issues of social injustice and poverty. This is a direct analogue to how many people now rely on Twitter or Facebook, rather than a printed newspaper, for much of their news and information.
4. Mission Creep – or the path forward
Technology startups have come to learn that success depends on laser-sharp focus, attention to detail and execution of a “pure play” strategy (i.e. only do one thing well). That particular discipline has time and time again proven to be effective in a sector where technology change is moving rapidly and most startups are generally considered to be underfunded.
Likewise, Social Enterprises must adopt similar approaches to deal with underfunding and change. Even in today’s more fluid and fast-changing environment, to avoid deadly Mission Creep, Board and management must have developed a complete Theory of Change roadmap to enable managing to outcomes.
NOTE: The intrusion and profusion of projects in my life has prevented me from blogging for some time. As 2011 draws to a close, I thought I needed to make an effort to provide my perspective on some important milestones in my world.
I just heard that, after a long illness, Dennis Ritchie (dmr) died at home this weekend. I have no more information.
I trust there are people here who will appreciate the reach of his contributions and mourn his passing appropriately.
He was a quiet and mostly private man, but he was also my friend, colleague, and collaborator, and the world has lost a truly great mind.
Although the work of Dennis Ritchie has not been top of my mind for a number of years, Rob’s posting dredged up some pretty vivid early career memories.
As the co-creator of UNIX, along with his collaborator Ken Thompson, and as the creator of the C Programming Language, Dennis had a huge and defining impact on my career, not to mention on the entire computer industry. In short, after years as a technology leader yet market laggard, it looks like, in the end, UNIX won. Further, I was blessed to meet Dennis on numerous occasions and, to that end, some historical narrative is in order.
Back in 1973, I got my first taste of UNIX at the University of Waterloo, serendipitously placing us among a select few outside of Bell Labs who tasted UNIX at such an early date. How did this come about? In 1972, Steve Johnson spent a sabbatical at the University of Waterloo and brought the B Programming Language (successor to BCPL and precursor to C, with all its getchar and putchar idiom) and yacc to the Honeywell 6050 running GCOS that the University’s Math Faculty Computing Facility (MFCF) had installed in the summer of 1972. Incidentally, although my first computer experience was in 1968 using APL on IBM 2741 terminals connected to an IBM 360/50 mainframe, I really cut my “hacker” teeth on “the ‘Bun” by writing many utilities (some in GMAP assembler and a few in B). But, I digress...
Because of the many connections made by Steve Johnson at that seminal time, the University of Waterloo was able to get Version 5 UNIX in 1973, before any real licensing by Western Electric and its descendants, by simply asking Ken Thompson to personally make a copy on 9-track magnetic tape. My early work at the Computer Communications Networks Group (CCNG) with Dr Ernie Chang, attempting to build the first distributed medical database (shades of Personal Health Records and eHealth Ontario?), led me to be among the first to get access to the first Waterloo-based UNIX system.
The experience was an epiphany for me. Many things stood out at the time about how UNIX differed from Operating Systems of the day:
Compactness: As described by a fellow UNIX enthusiast at the time, Charles Forsyth, it was amazing that the listing of the entire operating system was barely 2 inches thick. Compared to the feet of listings for GCOS or OS/360, it was a wonder of minimalistic, compact elegance.
High Level Languages: The fact that almost 98% of UNIX was coded in C with very little assembler, even back in the days of relatively primitive computing power, was a major breakthrough.
Mathematical Elegance: With clear inspiration from nearby Princeton and mathematical principles, the team built software that for its day was surprisingly mathematically pure. The notion of a single “flat file” format containing only text, coupled with the powerful notion of connecting programmes via pipes, made the modular shell and utility design a real joy to behold (a minimal sketch of the pipe mechanism follows this list).
Extensible: Although criticized at the time for being disc- and compute-intensive and unable to do anything “real time”, UNIX proved to have longevity because of a simple, elegant and extensible design. Compare the mid-1970’s UNIX implementations supporting 16 simultaneous users, on the 16-bit DEC PDP-11/45 with 512KB (note that this is “KB” not “MB”) with today’s Windows quad-core processors that still lock out typing for users, as if prioritized schedulers had never been invented.
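To make the pipe idea concrete, here is a minimal sketch (my own illustration, not Bell Labs code) of what the shell arranges for a pipeline like “who | wc -l”, using the classic pipe/fork/exec system calls:

```c
/* Wire the output of "who" into the input of "wc -l", roughly what the
   shell does for "who | wc -l".  Error handling is kept to a minimum. */
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

int main(void)
{
    int fd[2];                          /* fd[0] = read end, fd[1] = write end */
    if (pipe(fd) == -1) { perror("pipe"); exit(1); }

    pid_t pid = fork();
    if (pid == -1) { perror("fork"); exit(1); }

    if (pid == 0) {                     /* child: runs "who", writing into the pipe */
        dup2(fd[1], STDOUT_FILENO);
        close(fd[0]);
        close(fd[1]);
        execlp("who", "who", (char *)NULL);
        perror("execlp who");
        _exit(127);
    }

    /* parent: runs "wc -l", reading from the pipe */
    dup2(fd[0], STDIN_FILENO);
    close(fd[0]);
    close(fd[1]);
    execlp("wc", "wc", "-l", (char *)NULL);
    perror("execlp wc");
    return 127;
}
```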
At Waterloo, I led a team of UNIX hackers who took over an underused PDP-11/45 and created Math/UNIX. Many of today’s top computer talents adopted that system as their own, including Dave Conroy, Charles Forsyth, Johann George, Dave Martindale, Ciaran O’Donnell, Bill Pase and many more. We developed such innovations as highly personalized security known as Access Control Lists, Named Pipes, networked file and printing connections to the Honeywell 6050 and IBM mainframes, and much more. Over time, the purity of UNIX Version 7 morphed into the more complex (and perhaps somewhat less elegant, as we unabashedly thought at the time) Berkeley Software Distribution (BSD) from the University of California at Berkeley. That being said, BSD added all-important networking capabilities using the then nascent TCP/IP stack, preparing UNIX to be a central force in powering the internet and web. As well, BSD added many security and usability features.

My first meeting with Dennis Ritchie was in the late 1970s when he came to speak at the U of W Mathematics Faculty Computer Science Club. Having the nicest car at the time meant that I got to drive him around. I was pleasantly surprised at how accessible he was to a bunch of (mostly grad) students. In fact, he was a real gentleman. We all went out to a local pub in Heidelberg for the typical German fare of schnitzel, pigtails, beer and shuffleboard. I recall him really enjoying a simple time out with a bunch of passionate computer hackers.

I, along with Dave Conroy and Johann George, moved on from the University of Waterloo to my first software startup, Mark Williams Company, in Chicago, where I wrote the operating system and many utilities for the UNIX work-alike known as Coherent. Mark Williams Company, under the visionary leadership of Robert Swartz, over the years hosted some of the top computer science talent in the world. Having previously worked with Dave Conroy on a never-completed operating system (called Vesta), I again found the intellectual purity and elegance of UNIX beckoning, and built Coherent as a respectful tribute to the masters at Bell Labs. Other notable luminaries who worked on Coherent are Tom Duff, Ciaran O’Donnell, Robert Welland, Roger Critchlow, Dave Levine, Norm Bartek and many more. Coherent was initially developed on the PDP-11/45 for expediency and was running just over 10 months from inception. A great architecture and thoughtful design meant that it was quickly ported to the Intel x86 (including the IBM PC, running multi-user on its non-segmented, maximum of 256KB of memory), Motorola 68000 and Zilog Z8001/2. The last architecture enabled Coherent to power the Commodore 900, which was for a time a hit in Europe and, in fact, used by Linus Torvalds as a porting platform in developing Linux.

I got to meet Dennis several times in the context of work at Coherent. First, in January 1981 at the then fledgling UNIFORUM in San Francisco, Dennis and several others from Bell Labs came to the Mark Williams suite to talk to us and hear more about Coherent. I remember Dennis reading the interrupt handler, a particularly delicate piece of assembler code, and commenting on how few instructions it took to get through the handler into the OS. Obviously, I was very pleased to hear that, as minimizing such critical sections of the code is what enhanced real-time response. The second time was one of my first real lessons in the value of intellectual property.
Mark Williams had taken significant measures to ensure that Coherent was a completely new creation and free of Bell Labs code. For example, Dave Conroy’s DECUS C compiler, written totally in assembler, was used to create the Coherent C compiler (later Let’s C). Also, no UNIX source code was ever consulted or present. I recall Dennis visiting as the somewhat reluctant police inspector working with the Western Electric lawyers, under Al Arms. Essentially, he tried all sorts of documented features (like “date -u”, which we subsequently implemented) and found them to be missing. After a very short time, Dennis was convinced that this was an independent creation, but I suspect that his lawyer sidekick was hoping he’d keep trying to find evidence of copying. Ironically, almost 25 years later, in the SCO v. IBM lawsuit over the ownership of UNIX, Dennis’s visit to Mark Williams to investigate Coherent was cited as evidence that UNIX clone systems could be built. Dennis’s later posting about this meeting is covered in Groklaw.

In 1984, I co-founded MKS with Alex White, Trevor Thompson, Steve Izma and, later, Ruth Songhurst. Although the company was supposed to build incremental desktop publishing tools, our early consulting led us into providing UNIX-like tools for the fledgling IBM PC DOS operating environment (this is a charitable description of the system at the time). This led to MKS Toolkit, InterOpen and other products aimed at taking the UNIX zeitgeist mainstream. With its first commercial release in 1985, this product line eventually spread to millions of users, and even continues today, surprising even me with both its longevity and reach. MKS, having endorsed POSIX and X/Open standards, became an open systems supplier to IBM MVS, HP MPE, Fujitsu Sure Systems, DEC VAX/VMS, Informix and Sun Microsystems.

During my later years at MKS, as the CEO, I was mainly business focussed and, hence, I tried to hide my “inner geek”. More recently, coincidentally as geekdom has progressed to a cooler and more important sense of ubiquity, I’ve “outed” my latent geek credentials. Perhaps it was because of this that I rarely thought about UNIX and the influence that talented Bell Labs team, including Dennis Ritchie, had on my life and career.

Now in the second decade of the 21st century, the world of computing has moved on to mobile, cloud, Web 2.0 and Enterprise 2.0. In the 1980s, after repeated missed expectations that this would (at last) be the “Year of UNIX”, we all became resigned to the total dominance of Windows. It was, in my view, a fatally flawed platform with poor architecture, performance and security, yet Windows seemed to meet the needs of the market at the time. After decades of suffering through the “three finger salute” (Ctrl-Alt-Del) and waiting endlessly for that hourglass (now a spinning circle – such is progress), in the irony of ironies UNIX appears on course to win the battle for market dominance. With all its variants (including Linux, BSD and QNX), UNIX now powers most of the important mobile and other platforms such as MacOS, Android, iOS (iPhone, iPad, iPod) and even the Blackberry Playbook and BB10. Behind the scenes, UNIX largely forms the architecture and infrastructure of the modern web, cloud computing and also all of Google. I’m sure, in his modest and unassuming way, Dennis would be pleased to witness such an outcome to his pioneering work.
The Dennis Ritchie I experienced was a brilliant, yet refreshingly humble and grounded man. I know his passing will be a real loss to his family and close friends. The world needs more self-effacing superstars like him. He will be greatly missed.
I think there is no more fitting way to close this somewhat lengthy blogger’s ramble down memory lane than with a humorous YouTube pæan to Dennis Ritchie, “Write in C”.
“How You Gonna Keep ‘Em Down On The Farm” (excerpt) by Andrew Bird
Oh, how ya gonna keep ’em down? Oh no, oh no
Oh, how ya gonna keep ’em down?
How ya gonna keep ’em away from Broadway?
Jazzin’ around and painting the town?
How ya gonna keep ’em away from harm? That’s the mystery
______________________
This week, my 18 month old Blackberry finally bit the dust. Out of this came a realization that led me to the challenge I issue at the end of this post.
Please don’t view my device failure as a reflection on the reliability, or lack thereof, of Blackberry handsets. Rather, as a heavy user, I’ve found that the half-life of my handsets is typically 18 to 24 months before things start to degrade – indeed, mobile devices do take a beating.
The obsolescence of one device is, however, a great opportunity to reflect on the age-old question: What do I acquire next? That is the subject of this posting, which focuses on the quantum changes in the mobile and smartphone market over the last couple of years.
I’ll start with a description of my smartphone usage patterns. Note that, in a later post, I plan to discuss how all this fits into a personal, multi-year odyssey toward greater mobile productivity across a range of converged devices and leveraging the cloud. Clearly, my smartphone use is just a part of that.
I’ve had Blackberry devices since the first RIM 957, and typically upgrade every year or so. I’ve watched the progression from simple push email, to pushing calendars and contacts, improved attachment support and viewing, even adding the “phone feature”. For years, the Blackberry has really focused on the core Enterprise functions of secure email, contacts and calendar and, quite frankly, delivered a seamless solution that just works, is secure and fast. It is for that reason that, up to the present day, my core, mission critical device has been a Blackberry. Over the last few years, I’ve added to that various other smartphone devices that have particular strengths, including the Nokia N95 (powered by Symbian OS), various Android devices and, my current other device, the ubiquitous Apple iPhone.
My current device usage pattern sees a Blackberry as my core device for traditional functions such as email, contacts and phone, and my iPhone for the newer, media-centric use cases of web browsing, social media, testing and using applications, and so on. Far from being rare, such carrying of two mobile devices seems to be the norm amongst many early adopters. Some even call it their “guilty secret.”
Over the recent past, I’ve seen my expectations of the mobile experience dramatically escalate. In reality, the term smartphone is a bit of a misnomer, as the phone function is becoming just one application among many in a complex, highly functional, personal, mobile computing device. The state of the art in converged mobile devices (smartphones and, increasingly, tablets) has indeed crossed the Rubicon. I believe that this new mobile universe is as big a break with the past for the mobile industry as the rise of the internet (particularly the web) was for the older desktop computing industry. Indeed, in several markets, 2010 is the year when smartphones outsell laptops and desktops (combined).
To summarize, the new palette of capabilities of this new mobile computing generation falls into several areas:
rich web browsing experience, typically powered by WebKit technology, which ironically was pioneered by ReqWireless (acquired by Google) right here in Waterloo. With the advent of HTML5, many, such as Google, view the browser as the new applications platform for consumer and business applications,
robust applications ecosystem, with a simple AppStore function to buy, install and update. iPhone and Android are pretty solid in this regard. Blackberry’s ill-fated AppWorld is an entirely different matter. For me, it was hard to find, not being on my Home Screen; application availability seemed to be (counterintuitively) dependent on the Blackberry model I was using; and the OS memory and security handling didn’t seem up to having applications actually work reliably. (Translation: I found that loading applications onto my Blackberry made the device slower and less reliable, so I ended up removing most applications.) Whatever the reasons, the iPhone AppStore has 250,000 applications with 5 billion downloads. Android Market has over 80,000 applications and Blackberry AppWorld lags significantly behind this.
user-friendly multi-media interface, including viewing of web, media and images, drag & drop and stretch & pinch capabilities. So far, touch screen technologies used in both iPhone and Android seem to have won the race against competing keyboard-only or stylus-based alternatives. Personally, I believe there are still huge opportunities to innovate interfaces optimized for small screens and mobile usage, so I will remain open to the emergence of alternative and competing technologies. I’m convinced that one use case scenario doesn’t fit all.
a secure, modern & scalable operating system on which to build all of the above and to drive the future path of mobile computing. Given my heritage in the UNIX world starting in the 1970s, it is interesting to me that all modern smartphones seem to be built around a UNIX/Linux variant (iOS is derived from BSD UNIX and Android from Linux), which provides a proven, scalable and efficient platform for secure computing from mobiles to desktops to servers. Blackberry OS, by contrast, appears to be a victim of its long heritage, starting life less as a real operating system and more as a TCP/IP stack bundled with a Java framework that morphed over time (it sounds reminiscent of the DOS to Windows migration, doesn’t it?). To be fair, Microsoft’s Windows Phone OS also suffers from its slavish attempt to emulate Windows metaphors on smaller, lower-power devices, and the translation doesn’t work well.
I want to stress an important point. This is not solely a criticism of Blackberry being slow to move to the next mobile generation. In fact, some of the original smartphone pioneers are struggling to adapt to this new world order as well. My first smartphone was the Nokia 9000 Communicator, similar to the device pictured on the left, and first launched in 1996. Until recently, Nokia with their Symbian OS Platform was the leader in global smartphone market share. Likewise, Microsoft adapted their Windows CE Pocket PC OS, also first released in 1996, for the mobile computing market earlier in this decade, and that effort is now called Windows Phone, shown on the right. Both vendors just seem to have lost the playbook for success, but continue to thrive as businesses because smartphones represent a relatively small fraction of their overall businesses. However, feature phones and desktop OS and applications, respectively, are hardly likely to continue to be the growth drivers they once were.
I need to stress another point mentioned earlier. There will be competing approaches to platform, user interface and design. While it is possible that Android could commoditize the smartphone device market in the way that Wintel commoditized the mass PC desktop and laptop marketplace, I suspect that, being ubiquitous, personal and mobile, these next generation smartphones are likely to evolve into disparate usage patterns and form factors. That said, there will certainly be significant OS and platform consolidation as the market matures.
At last I get to my challenge. As an avowed early adopter, I have aggressively worked at productivity in a “mobile nomadic” workstyle which leverages open interfaces, use of the cloud and many different techniques. Even I am surprised by the huge enabling effect of modern hardware, communications and applications infrastructure in the mobile realm. Essentially, very few tasks remain that I am forced back to my desktop or laptop to accomplish. However, the sad fact is that the current Blackberry devices (also Nokia/Symbian and Microsoft) fail to measure up in this new world. Hence the comment about Farms and Paris. The new mobile reality is Paris.
My challenge comes in two parts:
What device should replace my current Blackberry?
Since the above article doesn’t paint a very pro Blackberry picture, what is RIM doing about this huge problem?
I should point out that I have every reason to want and hope that my next device is a Blackberry. RIM is a great company and a key economic driver for Canada, and I happen to live and work in the Waterloo area. Furthermore, I know from personal experience that RIM has some of the smartest and most innovative people in their various product design groups, not to mention having gazillions of dollars that could fund any development. Rather, I would direct my comments at the Boardroom and C-Suite level, as I am baffled why they have taken so long to address the above strategic challenges, which have already re-written the smartphone landscape. Remember that the iPhone first shipped in January 2007 and the 3G version over 2 years ago, so this is not new news. Android was a bit slower out of the gate, but has achieved real traction, particularly in the last few quarters. And, to be clear, I’m not alone in this – see “Android Sales Overtake iPhone in the US”, which goes on to show that the majority of Blackberry users plan to upgrade to something other than Blackberry. The lack of strategic response, or the huge delay in making one, remains an astonishing misstep.
Therefore, if anyone senior from RIM is reading this, please help me to come to a different conclusion. I very much would like to continue carrying Blackberry products now and into the foreseeable future.
For other readers, please comment with your thoughts. What device would you carry, and more importantly, why?
[NOTE: this post was written a week before today’s launch of the Blackberry 9800 Torch with OS 6. There are definitely some promising things in this design, but it remains to be seen if, indeed, this device represents the quantum leap that the new marketplace reality requires]
“Nature is by and large to be found out of doors, a location where, it cannot be argued, there are never enough comfortable chairs.” – Fran Lebowitz
I’m a believer that Location Based Services (LBS), coupled with the latest smartphones, will evolve a number of indispensable, and unexpected, killer applications.
That said, it’s pretty clear that those mission-critical applications remain to be found. Essentially, the whole LBS opportunity is a social experiment that early adopters are collaboratively helping to clarify.
It was with those thoughts in mind when I decided to start using some of the popular LBS social media applications, or should I say social games? These included FourSquare, Yelp and Gowalla.
Let me put this in context of other social media applications with which I’ve experimented. Back in 2007, when I decided to try the microblogging service Twitter, then in its infancy, I had low expectations. In fact, I expected to hate it, but mentally committed to give it a two-week trial just for the purposes of self-education. Over 3 years later, I’m still using it, love it and have found many applications at which Twitter excels – a personal clipping service, early information and a sense of what my universe of followees is up to are among them.
FourSquare, although popular, hasn’t (yet) passed my personal usefulness test. And, I suspect most others still consider it more a game than a mission critical application. While there is an element of fun, it seems to be the sort of thing you could easily drop without much loss.
In that context, it surprises me that FourSquare recently pushed a new version (1.7.1) to my iPhone that checks my actual proximity to locations. Since then, almost half of my check-ins fail to pass this new proximity test, even though I was physically at the location in question. Below, I have re-posted my support request, which gives more background.
But, suffice it to say, an application change that, on the surface, seemed sensible made the application way less attractive to me. That’s doubly deadly in a space which is still finding its spot. I’m interested in comments on both the major issue (startups alienating early adopters) and the specific issue.
I’m surprised that FourSquare has re-written the rules of an emerging LBS service without any notification. I am referring, of course, to the latest upgrade on my iPhone, in which check-ins deemed too distant from the intended location (by an undocumented and new algorithm) are suddenly deemed ineligible to accumulate points or badges. Because it is so fundamental, I’ve also decided to re-blog this, because it illustrates how the law of unintended consequences can have a huge impact on a young service’s future prospects. Translation: this wasn’t a well-thought-out change in so many ways.
Why do I say this? Here are just a few reasons:
1. For those of us who live in rural areas where cellular tower infrastructure is typically much more widely spaced (and often in the 850 MHz band vs. the 1900 MHz band, for broader coverage at lower densities), the inherent accuracy of locations reported by mobile devices is much lower. For example, at locations near to me, it is not uncommon to have the phone’s margin of error be as much as 4500 m to 6000 m. Although FourSquare doesn’t divulge their required closeness, I think it may be something like 500 m (a rough sketch of such a proximity test follows below). With that in mind, it is almost by definition that most rural “check-ins” will be, starting this week, flagged as ineligible. And, that’s the behaviour I’m seeing. Of course, in many instances GPS lowers this error, but it is surprising how many locations don’t have great GPS reception, such as indoors or in an automobile.
2. By changing the rules of the game on the fly, FourSquare has penalized those checking into locations that weren’t located that accurately in the first place – whether because of the reasons in #1 or because people weren’t told they had to define the location within a certain minimum delta of the actual location. For example, I suspect that people actually defined the location as they were walking toward the actual location, knowing that FourSquare didn’t care where the real actual location physically was. I find this behaviour in about 30-50% of the check-ins I’m doing since the change.
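For the curious, here is a speculative sketch (mine, not FourSquare’s code) of what such a proximity test amounts to: a great-circle distance between the device fix and the venue, compared against a radius. The 500 m threshold is only my guess; FourSquare has never documented the real value.

```c
/* Illustration of a check-in proximity test using the haversine formula.
   The venue, the device fix and the 500 m threshold are all hypothetical. */
#include <math.h>
#include <stdio.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

#define EARTH_RADIUS_M 6371000.0

/* great-circle distance between two lat/lon points, in metres */
static double distance_m(double lat1, double lon1, double lat2, double lon2)
{
    double rad  = M_PI / 180.0;
    double dlat = (lat2 - lat1) * rad;
    double dlon = (lon2 - lon1) * rad;
    double a = sin(dlat / 2) * sin(dlat / 2) +
               cos(lat1 * rad) * cos(lat2 * rad) *
               sin(dlon / 2) * sin(dlon / 2);
    return 2.0 * EARTH_RADIUS_M * atan2(sqrt(a), sqrt(1.0 - a));
}

int main(void)
{
    double venue_lat = 43.47, venue_lon = -80.54;   /* hypothetical venue */
    double fix_lat   = 43.50, fix_lon   = -80.58;   /* device fix ~4.6 km away */
    double threshold_m = 500.0;                     /* assumed, undocumented */

    double d = distance_m(venue_lat, venue_lon, fix_lat, fix_lon);
    printf("distance: %.0f m -> check-in %s\n", d,
           d <= threshold_m ? "eligible" : "rejected");
    return 0;
}
```

With a rural margin of error of 4500 m to 6000 m, almost any honest check-in would fail a test like this, which is exactly the behaviour described above.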
FourSquare was an experiment for me, but given these new rules, which appear to not have been well thought out for large swathes of geography, I’m considering shutting down my personal FourSquare use. For something that still provides no direct utility, I really don’t want to have to go back and re-enter all location information from scratch.
“It is sobering to reflect on the extent to which the structure of our business processes has been dictated by the limitations of the file folder.”
-Michael Hammer and James Champy, Reengineering Your Business
Recently, I unearthed a 10 year old book by Bill Gates, Business @ the Speed of Thought and took a bit of time to re-scan that 1999 book. On the first day of 2010, it seems appropriate to study technology trends to help give perspective to the future of the digital revolution.
Far from being an overtly partisan paean to Microsoft, the passion and enthusiasm for change, reflecting both Bill Gates’ personality and the thinking of that era, shine through.
What is being presented is a prescription for a world, focused primarily on business, where mass adoption of networked computing unleashes a digital, knowledge-based revolution.
In the 1990s, Information Technology (“IT”) was considered a “necessary evil” in business, being viewed largely as a cost centre and consigned to report to the CFO with a major focus on cost control. Although we’ve made some progress in the last decade, there is still a huge need to educate all business people on how essential IT is to creating competitive advantage, mitigating risk and enabling new products and services. In essence, IT and the modern business are inseparably intertwined. Gates suggests the notion of a “Digital Nervous System” as shorthand for a set of best practices for business to use. Although this shorthand hasn’t really caught on, the ideas behind it remain.
And yet, it is easy to dismiss much of this early proselytizing as impractical dreaming, a perception that the “dot com meltdown” probably exacerbated. So, how much of the 1990s vision makes sense today?
Somewhat surprisingly, the answer is most of it. While some trends were completely unanticipated by Bill Gates, and most of his peers, in the 1990s:
cloud computing, driven by virtualization, in which vast arrays of commodity computing power are outsourced and interconnected with high-bandwidth network connectivity, wasn’t considered at all. The 1990s ethos was company-controlled, in-house server farms.
social networking and social media weren’t even thought of. Driven by recent research into how ideas can spread and recent breakthroughs in the science of social networks, coupled with cheap, pervasive and always-connected computing, this is a real paradigm shift from the 1990s world view.
outsourcing and offshoring, in which large companies access “clouds” of talent and hence are looser federations of people, is perhaps only foreshadowed. The power of individual contractors and smaller businesses has been significantly enhanced, thus levelling the playing field in our modern digital economy.
many of the predictions and recommendations remain true or have gone from vision to reality:
Web 2.0 can be viewed as many pre-bubble Web 1.0 concepts finally coming to reality in the fullness of time. Whereas there used to be much talk about “bricks” versus “clicks”, the modern company is fully integrated with the Web as a part of the distribution strategy. Companies that were simply a “web veneer”, like Webvan or Pets.com are gone and long forgotten.
The Paperless Office, long envisaged as part of the digital revolution, is finally starting to arrive. While some days I personally seem to be unable to keep the paper monster under control, many companies have made great strides. For example, Gore Mutual Insurance Company, where I recently joined the Board, has completed a transformation to paperless insurance, making Canada’s oldest insurance company one of the first in North America to go paperless.
Disintermediation, or the death of the middleman, has accelerated recently. The fact that only highly differentiated specialist travel agents survive, and the plight of the newspapers (when compared to the wildly successful wire services), are two great examples of this prediction coming true.
Knowledge-Based Management Style has become much more open and collaborative, largely through the force of the digital revolution. Although exceptions exist, every company needs to encourage the “bad news” to flow up to the top. The days of the hubristic CEO who stifles the “inconvenient truth” or fires the naysayer are surely numbered. When almost every business starts to look like a knowledge business, and information is power, cultural or procedural barriers to real-time information flow become a serious competitive disadvantage.
Social Enterprise is being transformed by the digital revolution, and it would appear that Bill Gates was ahead of the curve in seeing this important trend. While much of government, healthcare and education remains in the 20th-century paradigm, IT has driven a remarkable change, such that the boundaries between for-profit, not-for-profit and government enterprises have blurred significantly. This transformation is an area of personal interest and enthusiasm.
Customer-Centred Business is both enabled by, and made essential in, the digital age. The importance of using technology to understand, serve and delight customers remains a key strategic advantage for businesses. Of course, there remain issues and concerns. Specifically, greater customer profiling can lead to privacy concerns, and we are still seeking the right balance. Furthermore, some first-generation customer touchpoints, like call centres, have left businesses with their only interpersonal contact being unpleasant. Anyone who has dealt with a Rogers call centre will immediately recognize an environment where siloed IT systems and unempowered call centre employees create customer alienation. Competition and continuous technological improvement should resolve this over time.
In summary, much of the 1990s technological vision was spot on. Obviously, it has taken far longer to bring into practice than people then suggested. However, compared to other societal changes, the march of digital technologies has been lightning fast.
Although it is less fashionable to be a technology visionary today, I believe we still need to look ahead to our future. There remain many unsolved issues in IT and, even more important, there are countless opportunities that future digital technologies will be able to deliver. Business leaders, governments and concerned citizens all need to understand and contribute to shaping a future world that will be both efficient and retain a good quality of life for us all. A long-term perspective remains important because business investments and decisions have a surprisingly long lifespan.
Thus, the promise of the digital revolution continues, and it will be improved by thinkers who can help us to shape our desired future.
Way back in the early 1990’s, I had the pleasure to be Maplesoft’s first independent, outside Director. At the time, I agreed to join that Board and committed to invest my time based on the strength of the team and the great product opportunity. Their intellectual property was embodied in a breakthrough symbolic computation engine, spun out of University of Waterloo, that had the potential to revolutionize, through automation, many mathematical, scientific and engineering activities.
Sadly, I had the chance to experience first hand how one of the most promising Waterloo technology companies could become embroiled in, and ultimately paralyzed by, a bad case of founderitis. Put simply, otherwise intelligent founders who have launched a great business sometimes allow ego and personal agenda to get in the way of the long-term interests of the business (and ultimately impede value formation for its shareholders). Usually, this involves blindness to what the real needs of the company are, as well as valuing personal control over business success.
The startup world is littered with cases of founderitis, and sadly some of the worst cases involve university professors. It is ironic that the brightest people in the world can be so colour blind to issues outside their area of academic speciality. At Verdexus, many such experiences have taught us that backing the right team of founders is the most important investment criterion of all. It is interesting that the lessons learned at Maplesoft allowed Open Text Corporation, which shared the same founder, to more easily overcome the founderitis problem, and the value creation there since the 1990s has been impressive.
While the media have minutely documented how years of stalemate and litigation (both threatened and actual) kept this business in a status quo, it is also notable that a new team of investors and management ultimately triumphed, leading to this very successful exit. They, especially Jim Cooper the current CEO and major investor, are to be commended for persevering, in spite of the founder situation, to turn that around and deliver value to all from a great Waterloo-based technology story.
Some of the key lessons are:
Founderitis (and the even more insidious variant known as Professor Founderitis) can be hard to overcome. This clearly speaks to the length of time it took for the great Maplesoft opportunity to reach today’s level. Too many years were locked up in battles with people who, instead of focusing on building business and market value for shareholders, including themselves, would seemingly rather have control even if it meant “going down with the ship.”
Strategic value can be achieved even in tough economic situations. Although the current investment and exit climate is probably the worst in decades, the company managed to get (assuming completion of earn-out inducements) somewhere around three times LTM revenues. That’s considered a great multiple in normal times, so for 2009 it should be cause for major celebration. In Maplesoft’s case, a long-term relationship with Cybernet Systems as a business partner made it easy for them to see the strategic value in the acquisition. Furthermore, as a distributor, the loss of Mathsoft product distribution rights heightened their awareness of the vulnerability of not owning intellectual property themselves. Hence this is really a story of how a great strategic fit, and a strategic valuation to match, made sense. Such a fit is much more independent of the business cycle.
As much as the media lauds the “quick flip” startup stories, and they do happen, the reality is that building a great business takes time. Business plans never quite work out as expected, and, yes, sometimes, interpersonal issues get in the way of great business value creation. What is remarkable here is that, while undoubtedly Maplesoft lost market share during the period of stalemate to products like Wolfram Research’s Mathematica, the company has managed to forge a strong commercial market, grow market share and ultimately prevail. Clearly this is testament to a great product and a great team of people now driving the Maplesoft opportunity.
Again, congratulations to Jim, his team and the investors who believed in this opportunity. It is rewarding to see, after all these years, this great company deliver a home run.
I am extremely pleased to share today’s announcement from Gore Mutual Insurance Company that I have been appointed to their Board of Directors. I was officially appointed at the July 28, 2009 Board meeting and initially, I will serve on the Audit, Pension and Conduct Review & Governance Committees.
Because people may see this as different from my other activities, I thought I would provide some perspective on what this appointment means for me personally.
Founded in 1839, this venerable Waterloo Region financial services institution is Canada’s oldest insurance company. Such a long and magnificent heritage and time scale is obviously very different from that of the technology startup scene. That said, this company is an object lesson to all in the nature of innovation in a long term business, and that intrigued me. The Gore, as it is affectionately known by most, has survived and thrived, not by resting on its legacy, but through a constant process of change and innovation, to stay ahead of the many curve balls that time throws at any business.
And yet, as a regulated financial services company, overseen by organizations like OSFI (the Office of the Superintendent of Financial Institutions) and with a mission and obligation to prudently pool and manage risks for their members and clients, the company always walks a much finer line than startups in regards to balancing risk and innovation. That is something that both intrigued me and impressed me about how well this company has navigated that highly tuned path, particularly in recent years. Having reviewed this history as part of my due diligence, I’m absolutely convinced that without innovation and change, the Gore would never have survived on its long legacy alone, and that’s a great testimonial when a company can manage that for 170 years.
A second consideration for me is the opportunity to learn more about the whole world of financial institutions. In my career, I’ve sold enterprise software to some of the largest such companies in the world (e.g. HSBC, UBS, NationsBank). Success with these customers entailed a detailed understanding of how to make their organizations more effective through better IT productivity and quality. The chance to drill deeper into a highly respected local firm like The Gore, and to help shape their forward strategy, was not to be missed. I was warned, and certainly recognize, that the learning curve will be steep, but I’m already relishing that challenge, having just internalized 5 thick binders of briefing materials and background documentation.
Over the years, first as a CEO and more recently as a hands-on investor, I’ve internalized a very important truth — the people you work with (or invest in, or sit on a board with) must be the most important selection factor. In my due diligence, I was absolutely impressed with management. The CEO, Kevin McNeil, has assembled an extremely impressive team that has moved the Gore forward a quantum leap over the last few years. Likewise, the Board members are a diverse, smart and engaged team that complements this excellence in management. It is to be noted that they come from many industries and all had the same learning curve I did, including one from the Life Insurance industry which is a different business from general insurance. Furthermore, each and every employee I have talked to so far seems passionate, committed, yet not afraid to make the difficult suggestions to help shape the company’s execution. This is a company of surprisingly innovative people and in that respect, it isn’t really that different from the culture in a startup.
Finally, I note that 2009 is an interesting time to be joining a financial services company Board. While Property and Casualty insurers haven’t seen the exposure to the excesses of the credit crunch (i.e. CDOs and similar derivatives), it is clear that the public’s need for transparent and well-executed corporate governance has never been more heartfelt. As I mentioned, the Gore is easily one of the best governed companies I’ve encountered.
Currently, I am appointed to serve on the Audit, Pension and Conduct Review & Governance Committees. Particularly in the case of the Audit Committee, my deep background in IT systems will help to balance the risk review and management activities in that important area.
I am happy to share more with people about this appointment over the coming weeks.