“Expectation is the mother of all frustration.” – Antonio Banderas
Meeting requests are an amazing invention. Pioneered and standardized almost 20 years ago by companies like Microsoft (as part of Outlook/Exchange), Novell (GroupWise) and Lotus (Notes, now part of IBM), this innovation held great promise to automate an essential, yet completely routine, aspect of modern life.
The ascendancy of the meeting request also rides several trends:
In the 1990s, I had an Executive Assistant who scheduled my time, acted as a “gatekeeper” and also worked on many projects. She was a master tactician who managed to keep 3 or more Type A executives productively multi-tasking. In many ways, sadly, such personal assistance is being subsumed by…
Increasing computational power means that automation of routine tasks, personalized to the needs of individuals, is much more of a reality,
The mobile revolution has made meetings much more multi-modal and virtual, but also means that most executives must be productive even while being mobile nomads, and
Calendars have migrated from paper – I switched about 20 years ago – to desktop computers using Outlook and the like, and now to the ubiquitous smartphone and tablet devices. Such mobile devices are convenient for calendars, but also frustratingly fiddly places to enter complex meeting details.
Thus enters the humble Meeting Request, which has swelled in popularity. I received my first such request from an Outlook/Exchange user around 2000, and they remained rare until perhaps the last 5-10 years. Now they seem to be everywhere.
In homage to my friend and colleague, Jim Estill, the quintessential time management guru, I ought to be cheering this time saving invention.
And yet, my enthusiasm is sorely tinged by frustrating implementations resulting in a suboptimal user experience…
Top 10 Meeting Request FAILs:
Trojan Horse: It has always seemed odd to me that a third party inviting me to a meeting could embed their own meeting information in my calendar, and yet I am unable to edit this “foreign” request that has invaded my calendar.
Split Personality: If Jennifer invites me, Randall, to a meeting, then why does my meeting title say “Meeting with Randall” instead of “Meeting with Jennifer”? Computers are designed to automate routine tasks so there is absolutely no excuse for this one.
No Annotation: I write comments in the notes fields of my calendar all the time. Why can’t I say, for example, “Joe is a bit dodgy” or “First met back in 2001”?
Duplication: Many times I receive a meeting request for a meeting for which I have already carefully crafted an entry in my own calendar. Again, computers are supposed to be smart enough to figure these things out and merge them in an intelligent way.
Bad Versioning: Many times when meeting information is changed, such as time or venue, the update isn’t seamless. For example, it is common to have both the original and the updated version lingering in my calendar.
No Scheduling: Meeting requests are often used as trial balloons when trying to schedule busy people into meetings. The endless rounds of “Accept”, “Maybe” or “Decline” responses can end up being quite frustrating, especially for many-person meetings. These often fruitless interchanges underscore the fact that meeting requests don’t automate routine scheduling. Instead, people have to resort to tools like Doodle to vote on alternatives, and then manually schedule the winning result.
Verbosity: superfluous words clutter the limited real estate of the meeting subject line, e.g. pre-pending “Invitation:” or “Updated Invitation:” onto the front of a subject, effectively burying the important words. Many times these are put there to increase the impact and readability of the email subject line to ensure opening, but they distract in the actual Calendar entry.
Ugly Invitations: invitations from Google Enterprise Apps or Gmail tend to be the most arcane and ugly. Originally, I chalked this up to Google Calendar’s relative immaturity compared to Outlook, but the brutally long notes and long subject lines continue to stand out as worst in class, almost to the point that I dread getting invited by Google users.
Lack of Anticipatory Computing: in an age where mobile devices know location, existing meetings and other personal habits, the trend toward predictive intelligence could be incorporated into smarter meeting requests. For example, combining meeting requests with shared “Free/Busy” data could remove many manual scheduling steps (see the sketch after this list).
No Personalization: Like my contact list, I put a fair bit of thought into crafting a calendar that is both useful now and provides a detailed audit trail of my business interactions. To do this, I use conventions, categories and other techniques that, sadly, cannot be injected into these uneditable meeting requests, which instead reflect the third party initiator’s preferences.
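To make the Free/Busy point in the list above concrete, here is a minimal sketch, in Python, of how a smarter meeting request could intersect attendees’ published Free/Busy intervals and propose the first workable slot automatically. The function names and data structures are my own illustration, not any calendar vendor’s API.

```python
from datetime import datetime, timedelta

# Minimal sketch: each attendee publishes "busy" intervals (start, end),
# as Free/Busy feeds already allow. We propose the first gap of the
# requested length in which no attendee is busy.

def merge_busy(busy_lists):
    """Flatten everyone's busy intervals into a sorted, merged list."""
    intervals = sorted(i for lst in busy_lists for i in lst)
    merged = []
    for start, end in intervals:
        if merged and start <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], end))
        else:
            merged.append((start, end))
    return merged

def first_free_slot(busy_lists, window_start, window_end, duration):
    """Return the first slot of `duration` that suits all attendees."""
    cursor = window_start
    for start, end in merge_busy(busy_lists):
        if cursor + duration <= start:
            break
        cursor = max(cursor, end)
    return cursor if cursor + duration <= window_end else None

# Example: two attendees, one-hour meeting within a working day.
day = datetime(2015, 1, 5)
alice = [(day.replace(hour=9), day.replace(hour=11))]
bob = [(day.replace(hour=10), day.replace(hour=12))]
print(first_free_slot([alice, bob],
                      day.replace(hour=9), day.replace(hour=17),
                      timedelta(hours=1)))   # 2015-01-05 12:00:00
```

None of this is rocket science – which is rather the point: the data already exists in Free/Busy feeds, yet we still schedule by endless volleys of Accept/Decline.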
Do let me know in comments if I missed any major points.
Given the power of networked computing to automate, why is there such a lack of excellence and progress in this particular area?
In fairness, I believe that part of the problem lies in the interplay between competition and the vagaries of formal industry standards. That said, this should be no excuse.
It is admirable that, unlike word processing formats, the various pioneers started to develop standards called vCalendar (and later iCalendar) around 1997 to standardize file formats (like .ical and .ics) and email server interactions. I do know that Microsoft attempted to extend the functionality with some very useful things around that time. For some reason, though, a great idea got off to a good start but seems frozen at an almost beta level of functionality.
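For the curious, the standard itself is refreshingly simple: plain text you can read in any editor. Here is a minimal sketch, in Python, that emits the kind of bare-bones .ics invitation these systems exchange; the UID and email addresses are made-up placeholders.

```python
# Minimal sketch: a bare-bones iCalendar (.ics) meeting request of the
# kind Outlook, Google Calendar and the like exchange. Fields follow
# the iCalendar RFCs; the UID and addresses are made-up placeholders.
ICS_TEMPLATE = """BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//Example Corp//Blog Demo//EN
METHOD:REQUEST
BEGIN:VEVENT
UID:20150104T120000-demo@example.com
DTSTAMP:20150104T120000Z
DTSTART:20150112T140000Z
DTEND:20150112T150000Z
SUMMARY:Meeting with Jennifer
ORGANIZER:mailto:jennifer@example.com
ATTENDEE;PARTSTAT=NEEDS-ACTION:mailto:randall@example.com
END:VEVENT
END:VCALENDAR
"""

# The spec requires CRLF line endings; newline="" stops Python from
# translating them a second time on Windows.
with open("invite.ics", "w", newline="") as f:
    f.write(ICS_TEMPLATE.replace("\n", "\r\n"))
```

Notice that the SUMMARY is whatever the organizer typed – which is exactly why my calendar ends up saying “Meeting with Randall” instead of “Meeting with Jennifer”.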
To conclude, please read this post, not as a gripe, but instead as a call to action to developers to help take the humble meeting request to the next level of user experience. Any takers?
Almost four and a half years ago, I penned what some called the obituary of Blackberry (see “How You Gonna Keep ‘Em Down on the Farm”). My intentions in writing that missive were, in fact, quite the opposite. Back in 2008, a year after the first iPhone, Blackberry didn’t appear to be heeding the threat of major market disruption, let alone making a response. I thought that writing such a post might incite some action. Sadly, while I got loads of reaction from all over the world, the one missing piece was that this was singularly not registering inside the “Faraday Cage” of RIM headquarters at Philip and Columbia in Waterloo.
For many years, to continuously hone my expertise as an investor and participant in the next generation mobile ecosystem at VERDEXUS, I have maintained a “production” device and a “testing” device, which allows me to sample the greatest number of new applications and platforms in my daily business and personal life. At the time of the 2008 post, my production device was a Blackberry, with an iPhone as the testing device. I then promoted the iPhone to production and introduced an early Android device into testing status.
The four and one-half years since then, representing four to five mobile device generations at the rapid pace at which these are deployed, have seen a lot of innovation and change in the mobile universe. The first production version of Android arrived one month after the aforementioned post. Today, over nine releases later, Android 4.2, known as Jelly Bean, is a mature and polished mobile platform.
Mobile user experience has, as it were, come up from the Farm and we are now definitely in Paree. It’s hard to imagine how things could get much better, yet an even more exciting future in mobile will undoubtedly unfold. The pace of change has been almost mind boggling, with Android appearing to move almost twice as fast as iOS, the more proprietary Apple platform running iPhone and iPad.
As a young platform, Android has long shown promise. Being an open source operating system primarily developed by Google, but customized by various device manufacturers, not to mention the ever-meddling carriers, has been both a blessing and a curse. Initially, Android seemed “rougher around the edges” and more techie in feel than the uber-polished and legendary iPhone experience, which is produced end to end by Apple.
Conversely, the limitations of the Apple closed ecosystem approach are starting to overtake the advantages. There are numerous examples. If you simply want to plug in your device via USB and load music and other files, Android shines by bypassing the need to go through iTunes. While iTunes has its advantages, many of us simply want more control over our cross-device media file deployments. Another, even more telling, example is the recent debacle in which Apple turfed the tried and true Google Maps application in favour of a badly implemented and incomplete version of their own. This is but one example of where Apple’s legendary quest for control is wearing thin.
While giving more control to mobile application developers has its challenges, it is clear that no one company, no matter how sainted, can determine, let alone fully serve, the desires and needs of the entire mobile universe.
It is a combination of this clear advantage, coupled with the incredible progress in Android and its handset manufacturers, that has led me to promote my newest device to production and relegate the formerly top-billed iPhone to the second-tier status of my test device.
For me, that device is the Samsung Galaxy Note 2 which, with its 5.5″ screen, is sometimes dubbed a “phablet” (i.e. a combination of phone and tablet). Essentially a super-sized Galaxy S3, this phone is nimble, with fast computational processing and speedy network connectivity. I first saw Europeans using it a few months ago – a cool and capable device, but perhaps an acquired taste for some.
Perhaps it is simply my poor vision, but the large screen size is versatile and a joy to work with for all sorts of browsing, content and documents. The S Pen stylus, even for those who don’t want to do handwriting or line drawings, transforms the mobile browsing experience by removing the navigation problems on many sites whose menus are small on mobile screens. Samsung has even developed an SDK around the S Pen, which could create a whole new application ecosystem, assuming this next generation stylus gains sufficient market traction.
Is my recent promotion of Android to the top device spot the end of my quest for mobile perfection? Absolutely not! In fact, only one week ago, I personally promised my colleague Alec Saunders, the ubiquitous and transformational new VP of Developer Relations for Research In Motion, that I will definitely give the new Blackberry BB10 devices a serious try. And not just because “Devs, Blackberry Is Going to Keep on Loving You”. I truly do like much of what I’m hearing about their capabilities.
Stay tuned – the mobile world is a fascinating and ever changing one.
In the world of wine, the concept of terroir describes a centuries-long process in which the climate, soil, grape varieties and dedicated vintners symbiotically develop a unique “sense of place” for a wine region. A favourite of mine, the garrulous and quintessential Californian vintner Randall Grahm, while trying to establish the Old World notion of terroir in California, postulates that it is a long term proposition that can take centuries to develop.
As both a wine lover and serial tech entrepreneur, I firmly believe that building a tech cluster is similarly a very long term process. Ironically, the epicentre of tech clusters is in California: Silicon Valley, which got its start in the 1950s, remains the major cluster worldwide, as “… no other place as yet has the Valley’s scale and resilience.”
Although I started my tech startup career in the US, it was in Canada’s leading tech cluster of Waterloo that I built major companies and was one of the people who got that cluster started. Like Silicon Valley’s origins in Stanford University, the Waterloo cluster was initially fuelled by the University of Waterloo. Over time, a combination of executive and programming talent, capital and professional services capabilities led to the current state of almost 1000 technology companies. By contrast to Silicon Valley, Waterloo is a much younger cluster, having started just over 25 years ago compared to Silicon Valley’s 60 years. It continues to mature around some key ingredients, such as global strategic marketing capabilities and sufficient capital to fund on a globally competitive basis. Experienced people may well be the most important ingredient in a cluster’s maturation.
Further, I feel that all who have been fortunate enough to build wealth and experience in business owe an obligation to “pay it forward” to the next generation. My own contributions include significant startup mentoring, Board and strategic roles in organizations like Communitech and Innovation Guelph, and, for the last 3 years, a Board role and chairing the Selection Committee for the Golden Triangle AngelNet (GTAN). In just 3 years, GTAN has grown to about 150 paid accredited investor members who bring a wealth of experience to the 25 funding transactions to date. And it goes without saying that many of those financings might not have happened without GTAN having emerged to fill a significant funding gap as VCs became largely extinct. Acting as a superangel to syndicate angel network deals is a tremendously labour intensive exercise, but one that I and others believe will pay off in the long term economic prosperity of our region.
I firmly believe knowledge-based companies to be the key ingredient of our future economic prosperity, so such company-building competence is mission critical for our region, province, country and the globe. As globalization proceeds, we see more and more regions clamouring to reap the riches of the innovative tech startup world.
To that end, at Verdexus we have always taken a transatlantic perspective, primarily to have a more global window on building companies that can achieve world leadership in their chosen businesses. Over the years, I’ve worked with startups across the United States and Europe in dominant clusters such as Boston, Chicago, Silicon Valley, London, Munich, Berlin, Stockholm and more. To round out my experience over the last few years, I’ve sampled some key emerging regions by volunteering as an expert judge in places as diverse as the Brussels area, Warsaw and Torino. A week ago, I had the opportunity to judge startups associated with the European Space Agency in Toulouse, France as well as in Istanbul, Turkey. The latter Istanbul venue, the EU Venture Forum, was jointly sponsored by EUREKA (the pan-European research and development funding and coordination organization) and Europe Unlimited from Brussels. Collectively, these more than a dozen regional events ultimately feed into a pan-European venture prize in Berlin in December.
It has been very instructive to visit various clusters. This grassroots view, from the perspective of startups, reveals much in common globally, but also a few surprises. Based solely on interacting with local startups, it is clear that culture and experience vary greatly across European regions. For example, I was pleasantly surprised that Warsaw had some of the smartest and most sophisticated business startups I’d seen anywhere. And, remember, they are pitching in English, which is not their native language. Conversely, the cluster around Torino appeared to have a long way to go before its startups would begin to measure up globally.
Pitching in Istanbul
Similarly, the startups I saw in Istanbul were impressive. Some companies, following a model also common to the emerging markets of Central and Eastern Europe, were essentially cloning an existing business model into the 80-million-strong Turkish market. More significantly, others were clearly building globally strong technology startups. One pleasant surprise was that, of the eight companies that I coached the day before the forum, three had women CEOs. This was a surprise for Turkey, but sadly women-led companies remain all too rare in Canada as well.
The calibre of engineering and basic technology talent was very impressive. That said, it was also clear that the support ecosystem around these startups is very limited – at least compared to what we see here in North America. One direct challenge is that European companies appear to receive generous R&D funding, which seems to encourage more of an engineering mentality than a market-driven one. In essence, projects stay too long as “science projects”, and the culture and skills to get projects to market seem to suffer as a result. This is a generalization, of course, with many exceptions.
In the area of capital, Europe’s meltdown in Venture Capital A Round investments is about 3-4 years behind what has already occurred in Canada. One particularly European challenge is that more and more of the VC funds have moved their offices and focus from regional markets to London, meaning that companies in the regions often have less direct access to capital. Conversely, the growing role of Angel Networks and Superangels in filling the gap is still in its infancy in Europe. I suspect that will change over the next two or three years. Venture funders like either to be close (1 hour travel) to their portfolio companies or, at the very least, to have a local investor who can “provide adult supervision”. Increasingly, experienced serial entrepreneurs will be called on to fill that key local role as Angels and Superangels. It is clear that the notion of Tim Draper going to Estonia and finding Skype is the exception rather than the rule.
And that takes me right back to the notion of “tech terroir”. As global innovation increases, and people around the world vie to build ever stronger tech startup ecosystems, it is the dedicated entrepreneurs in the sector who magically nurture these maturing ecosystems. As one of the entrepreneurs that I coached put it, she wants to:
“make innovation easier in Turkey and to make life easier for entrepreneurs”
So, in addition to building a great global business, she also takes time to help move the needle of her local ecosystem forward. It’s a very encouraging sign that continues to inspire me as I engage with the new globalized world of tech startups.
This summer I took time to re-read an oft-overlooked volume – Clayton Christensen’s The Innovator’s Dilemma – which I believe to be essential for anyone working in marketing and innovation. In this review, I’ll provide a few examples of why this book needs more attention, particularly here in Canada where we definitely need to up our game in the marketing of innovation and technology.
Clayton Christensen, an Associate Professor of Business Administration at Harvard Business School, is a leading academic researcher on innovation. Yet he still manages to provide practical and pragmatic strategies that real companies can use. And, most importantly, his theoretical groundwork is based on extensive, data-intensive research over long periods of time with real companies and markets going through disruptive innovation.
The latter term is often thrown around lightly in technology company circles. A disruptive technology (or innovation) typically has worse product performance in mainstream markets while having key features that interest fringe and emerging markets. By contrast, sustaining technologies provide improved product performance (and often price) in mainstream markets.
The book covers real markets, including the various generations of disk drives, starting with 14″ drives in the 1970s through to today’s 2.5″ (and smaller) drives. By studying hundreds of companies that emerged, thrived and failed over a 25-year period, some clear patterns emerge. Further examples across a broad range of markets include the microprocessor market, the transition from cable-operated diggers to hydraulic “backhoes”, accounting software and even the transition of industrial motor controllers from mechanical to electronic programmable models.
The key message of the book is that the playbook for normal (“sustaining”) technology innovation must be thrown away for disruptive technologies. Disruptive technologies break traditional rules in many, often counter-intuitive ways:
Financial: typically, disruptive technologies are more expensive and have lower performance than existing products. This effect causes financial managers to kill many such innovations.
Marketing: the normal rule to “listen to your customers” must be thrown away; instead, many educated guesses with repeated failures are the only path forward.
Organization: given the tendency of normal processes to reject disruptive innovations, such practices as heavyweight teams (which silo the team with more autonomy) and even spin-outs are the order of the day.
Entrepreneurial writings, not to mention my own experience, encourage us to celebrate failure. Beyond the power of learning by trial and error, The Innovator’s Dilemma, for the first time, provides an analytical framework as to why such failure is so critical in new markets.
One area where the book could provide more guidance is that of differentiating disruptive from sustaining technologies. Such discrimination is absolutely critical to ensure the right strategic approach to the new technology is adopted. Generally easy with the benefit of hindsight, such determination can be very tricky, and error prone, when first confronted with such new technologies.
This is a book that anyone working with products in fast moving markets needs to re-read regularly. It surprises me, 15 years after publication, how few product marketers and senior executives appear to have benefited from the deep wisdom Christensen imparts.
A marvellous exploration of a research and innovation powerhouse that, even viewed from this age of innovation, surprisingly anticipated many approaches we think of as modern breakthroughs. I’ve long admired Bell Labs and feel that many of its researchers and innovations interacted with, and impacted, my own career. While at university, the notion of working with or at Bell Labs was the highest aspiration for top thinkers in many fields. The Idea Factory is an engaging read and showed me how limited my understanding of that institution really was.
First of all, from the 1920s to the 1980s, it was way ahead of its time as an agent of innovation. The approaches were brilliant and could be applied today, including the notion of building architecture and organization structures to encourage interdisciplinary collaboration. Breaking down “knowledge silos” was definitely countercultural in a century known for specialization.
Secondly, the sheer number of transformational inventions that came from this single institution – the laser, the transistor, fibre optics, satellite communications, the cellular mobile network, integrated circuits and the very notion of information as digital – is both surprising and likely impossible to replicate in today’s world. Sadly, in the modern competitive marketplace, there is likely no room for a monolithic regulated monopoly, as AT&T was, to support such a single engine of innovation and basic research.
My primary connection with Bell Labs was through computer science, with innovations such as UNIX and the C programming language. The historical context this book outlines shows how surprising this is, because AT&T was, by regulatory decree, precluded from entering the computer industry. That said, it is ironic that most of the inventions of Bell Labs collectively contrived to make telecommunications, as a separate industry, obsolete. Instead, as predicted as early as 1948 by the remarkable information age seer, Claude Shannon, much of the modern economy has been transformed by our current digital age of networked and pervasive computing.
Lastly, Gertner explores the culture of those who drove innovation. Often eccentric, and to outsiders perhaps impossible or unemployable, these individuals had the sheer force of will and brainpower to achieve breakthroughs that others either hadn’t even considered or thought impossible. Given my own small town origins, the deliberate strategy of finding these small town prodigies to populate the largest research-oriented brain trust in the world resonated with me.
All too often, societies believe that they are the first to master innovation. Sometimes we should stop and consider successful strategies from the past. Far from being solely a modern preoccupation, innovation has always been a hallmark of human advancement. Yet, with no clear place for a lucrative and regulated monopoly to fund pure research, where will the fundamental research of the future originate?
The book cites John Mayo, a former Bell Labs chief:
“Bell Labs’ substantial innovations account for a large fraction of the jobs in this country and around the world”
In a world driven by global markets and the quarterly thinking of Wall Street, we really do need to consider how our next leap of fundamental research will be unleashed. John Pierce, another Bell Labs chief summarized the “Bell Labs formula” in four main points:
“A technically competent management all the way to the top. Researchers who didn’t have to raise funds. Research on a topic or system could be, and was, supported for years. Research could be terminated without damning the researcher.”
Beyond learning from the wisdom of this leading research institution, where will we find the vision and resources to enable innovation on such a transformational scale? Beyond the Venture Capital and now Angel funded technology startup ecosystem, perhaps exemplars like Mike Lazaridis‘s pioneering Perimeter Institute for Theoretical Physics will chart a course for the 21st century.
Today was a banner day for announcements involving a reset of the technology funding ecosystem in Canada.
For a long time, the slow demise of Canadian Venture Capital has concerned me deeply, putting us at an international disadvantage in regards to funding and building our next generation of innovative businesses. You may recall my 2009 post Who Killed Canadian Venture Capital? A Peculiarly Canadian Implosion? which recounts the extinction of almost all of the A round investors working in Ontario.
Since then, many of us have worked to bridge the gap by building Angel Networks, including the Golden Triangle AngelNet (GTAN), where I chair the Selection process, using extreme syndication and leverage to replace a portion of the missing A rounds.
Today, the launch of Round 13 Capital revealed a new model for venture finance centred around a strong Founder Board whose members are also LPs, each with a “meaningful” investment in the fund. My decision to get involved was based both on this strongly aligned wealth of operating wisdom coupled with the clear strength of the core team.
The launch was widely covered by a range of tech-savvy media.
To illustrate both the differentiation of Round 13 and the depth of founder experience, Bruce Croxon indicated that the founders board has, measured by aggregate exit value, built over $2.5 billion of wealth in Canada. It is this kind of vision and operational experience that directly addresses the second of the three points I argued Canadian Venture Capital needs to solve.
It is exciting to be involved with the unfolding next generation funding ecosystem for technology companies of the future. Time will tell the ultimate outcome, but I’m certainly bullish on Round 13.
NOTE: The intrusion and profusion of projects in my life has prevented blogging for some time. As 2011 draws to a close, I thought I should make an effort to provide my perspective on some important milestones in my world.
I just heard that, after a long illness, Dennis Ritchie (dmr) died at home this weekend. I have no more information.
I trust there are people here who will appreciate the reach of his contributions and mourn his passing appropriately.
He was a quiet and mostly private man, but he was also my friend, colleague, and collaborator, and the world has lost a truly great mind.
Although the work of Dennis Ritchie has not been top of my mind for a number of years, Rob Pike’s posting, quoted above, dredged up some pretty vivid early career memories.
As the co-creator of UNIX, along with his collaborator Ken Thompson, as well as the C Programming Language, Dennis had a huge and defining impact on my career, not to mention the entire computer industry. In short, after years as a leader in technology yet market laggard, it looks like in the end, UNIX won. Further, I was blessed with meeting Dennis on numerous occasions and, to that end, some historical narrative is in order.
Back in 1973, I got my first taste of UNIX at the University of Waterloo, serendipitously placing us among the select few who tasted UNIX outside of Bell Labs at such an early date. How did this come about? In 1972, Steve Johnson spent a sabbatical at the University of Waterloo and brought the B programming language (successor to BCPL and precursor to C, with all its getchar and putchar idiom) and yacc to the Honeywell 6050 running GCOS that the University’s Math Faculty Computing Facility (MFCF) had installed in the summer of 1972. Incidentally, although my first computer experience was in 1968 using APL on IBM 2741 terminals connected to an IBM 360/50 mainframe, I really cut my “hacker” teeth on “the ‘Bun” by writing many utilities (some in GMAP assembler and a few in B). But, I digress…
Because of the many connections made by Steve Johnson at that seminal time, the University of Waterloo was able to get Version 5 UNIX in 1973, before any real licensing by Western Electric and their descendants, by simply asking Ken Thompson to personally make a copy on 9-track magnetic tape. My early work at the Computer Communications Networks Group (CCNG) with Dr Ernie Chang, attempting to build the first distributed medical database (shades of Personal Health Records and eHealth Ontario?), led me to be among the first to get access to the first Waterloo-based UNIX system.
The experience was an epiphany for me. Many things stood out at the time about how UNIX differed from Operating Systems of the day:
Compactness: As described by a fellow UNIX enthusiast at the time, Charles Forsyth, it was amazing that the listing of the entire operating system was barely 2 inches thick. Compared to the feet of listings for GCOS or OS/360, it was a wonder of minimalistic, compact elegance.
High Level Languages: The fact that almost 98% of UNIX was coded in C with very little assembler, even back in the days of relatively primitive computing power, was a major breakthrough.
Mathematical Elegance: With clear inspiration from nearby Princeton and mathematical principles, the team built software that, for the day, was surprisingly mathematically pure. The notion of a single “flat file” format containing only text, coupled with the powerful notion of connecting programmes via pipes, made the modular shell and utility design a real joy to behold (see the sketch after this list).
Extensible: Although criticized at the time for being disc- and compute-intensive and unable to do anything “real time”, UNIX proved to have longevity because of a simple, elegant and extensible design. Compare the mid-1970s UNIX implementations supporting 16 simultaneous users on the 16-bit DEC PDP-11/45 with 512KB (note that this is “KB” not “MB”) with today’s Windows quad-core processors that still lock out typing for users, as if prioritized schedulers had never been invented.
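For readers who didn’t live through that era, here is a minimal sketch, in Python, of what the pipe idiom accomplishes: the classic `who | wc -l` pipeline (count the logged-in users), with the plumbing the UNIX shell’s `|` performs spelled out explicitly. The two utilities are the real ones; only the wiring is illustrative.

```python
import subprocess

# The classic UNIX one-liner `who | wc -l`: because every utility reads
# and writes a flat stream of text lines, they compose freely.
who = subprocess.Popen(["who"], stdout=subprocess.PIPE)
wc = subprocess.Popen(["wc", "-l"], stdin=who.stdout,
                      stdout=subprocess.PIPE)
who.stdout.close()          # let `who` see SIGPIPE if `wc` exits early
count, _ = wc.communicate()
print(count.decode().strip(), "users logged in")
```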
At Waterloo, I led a team of UNIX hackers who took over an underused PDP-11/45 and created Math/UNIX. Many of today’s top computer talents adopted that system as their own, including Dave Conroy, Charles Forsyth, Johann George, Dave Martindale, Ciaran O’Donnell, Bill Pase and many more. We developed such innovations as highly personalized security known as Access Control Lists, Named Pipes, file and printing network connections to Honeywell 6050 and IBM mainframes, and much more. Over time, the purity of UNIX Version 7 morphed into the more complex (and perhaps somewhat less elegant, as we unabashedly thought at the time) Berkeley Software Distribution (BSD) from the University of California at Berkeley. That being said, BSD added all-important networking capabilities using the then nascent TCP/IP stack, preparing UNIX to be a central force in powering the internet and web. As well, BSD added many security and usability features.

My first meeting with Dennis Ritchie was in the late 1970s, when he came to speak at the U of W Mathematics Faculty Computer Science Club. Having the nicest car at the time meant that I got to drive him around. I was pleasantly surprised at how accessible he was to a bunch of (mostly grad) students. In fact, he was a real gentleman. We all went out to a local pub in Heidelberg for the typical German fare of schnitzel, pigtails, beer and shuffleboard. I recall him really enjoying a simple time out with a bunch of passionate computer hackers.

I, along with Dave Conroy and Johann George, moved on from the University of Waterloo to my first software startup, Mark Williams Company, in Chicago, where I wrote the operating system and many utilities for the UNIX work-alike known as Coherent. Mark Williams Company, under the visionary leadership of Robert Swartz, over the years hosted some of the top computer science talent in the world. Having previously worked with Dave Conroy on a never-completed operating system (called Vesta), I again found the intellectual purity and elegance of UNIX beckoning me to build Coherent as a respectful tribute to the masters at Bell Labs. Other notable luminaries who worked on Coherent are Tom Duff, Ciaran O’Donnell, Robert Welland, Roger Critchlow, Dave Levine, Norm Bartek and many more. Coherent was initially developed on the PDP-11/45 for expediency and was running in just over 10 months from inception. A great architecture and thoughtful design meant that it was quickly ported to the Intel x86 (including the IBM PC, running multi-user in its non-segmented, maximum of 256KB of memory), Motorola 68000 and Zilog Z8001/2. The last architecture enabled Coherent to power the Commodore 900, which was for a time a hit in Europe and, in fact, used by Linus Torvalds as a porting platform in developing Linux.

I got to meet Dennis several times in the context of work at Coherent. First, in January 1981 at the then fledgling UNIFORUM in San Francisco, Dennis and several others from Bell Labs came to the Mark Williams suite to talk to us and hear more about Coherent. I remember Dennis reading the interrupt handler, a particularly delicate piece of assembler code, and commenting on how few instructions it took to get through the handler into the OS. Obviously, I was very pleased to hear that, as minimizing such critical sections of the code is what enhanced real-time response. The second time was one of my first real lessons in the value of intellectual property.
Mark Williams had taken significant measures to ensure that Coherent was a completely new creation and free of Bell Labs code. For example, Dave Conroy‘s DECUS C compiler, written totally in assembler, was used to create the Coherent C compiler (later Let’s C). Also, no UNIX source code was ever consulted or present. I recall Dennis visiting as the somewhat reluctant police inspector working with the Western Electric lawyers, under Al Arms. Essentially, he tried all sorts of documented features (like “date -u”, which we subsequently implemented) and found them to be missing. After a very short time, Dennis was convinced that this was an independent creation, but I suspect that his lawyer sidekick was hoping he’d keep trying to find evidence of copying. Ironically, almost 25 years later, in the SCO v. IBM lawsuit over the ownership of UNIX, Dennis’s visit to Mark Williams to investigate Coherent was cited as evidence that UNIX clone systems could be built. Dennis’s later posting about this meeting is covered in Groklaw.

In 1984, I co-founded MKS with Alex White, Trevor Thompson, Steve Izma and, later, Ruth Songhurst. Although the company was supposed to build incremental desktop publishing tools, our early consulting led us into providing UNIX-like tools for the fledgling IBM PC DOS operating environment (this is a charitable description of the system at the time). This led to MKS Toolkit, InterOpen and other products aimed at taking the UNIX zeitgeist mainstream. With its first commercial release in 1985, this product line eventually spread to millions of users, and even continues today, surprising even me with both its longevity and reach. MKS, having endorsed POSIX and X/Open standards, became an open systems supplier to IBM MVS, HP MPE, Fujitsu Sure Systems, DEC VAX/VMS, Informix and Sun Microsystems.

During my later years at MKS, as the CEO, I was mainly business focussed and, hence, tried to hide my “inner geek”. More recently, coincidentally as geekdom has progressed to a cooler and more important sense of ubiquity, I’ve “outed” my latent geek credentials. Perhaps it is because of this that I rarely thought about UNIX and the influence that talented Bell Labs team, including Dennis Ritchie, had on my life and career.

Now, in the second decade of the 21st century, the world of computing has moved on to mobile, cloud, Web 2.0 and Enterprise 2.0. In the 1980s, after repeated missed expectations that this would (at last) be the “Year of UNIX”, we all became resigned to the total dominance of Windows. It was, in my view, a fatally flawed platform with poor architecture, performance and security, yet Windows seemed to meet the needs of the market at the time. After decades of suffering through the “three finger salute” (Ctrl-Alt-Del) and waiting endlessly for that hourglass (now a spinning circle – such is progress), in the irony of ironies UNIX appears on course to win the battle for market dominance. With all its variants (including Linux, BSD and QNX), UNIX now powers most of the important mobile and other platforms, such as MacOS, Android, iOS (iPhone, iPad, iPod) and even the Blackberry Playbook and BB10. Behind the scenes, UNIX largely forms the architecture and infrastructure of the modern web, cloud computing and also all of Google. I’m sure, in his modest and unassuming way, Dennis would be pleased to witness such an outcome to his pioneering work.
The Dennis Ritchie I experienced was a brilliant, yet refreshingly humble and grounded man. I know his passing will be a real loss to his family and close friends. The world needs more self-effacing superstars like him. He will be greatly missed.
I think there is no more fitting way to close this somewhat lengthy blogger’s ramble down memory lane than with the humorous YouTube pæan to Dennis Ritchie, “Write in C”.
“How You Gonna Keep ‘Em Down On The Farm” (excerpt) by Andrew Bird
Oh, how ya gonna keep ’em down?
Oh no, oh no
Oh, how ya gonna keep ’em down?
How ya gonna keep ’em away from Broadway?
Jazzin’ around and painting the town?
How ya gonna keep ’em away from harm?
That’s the mystery
______________________
This week, my 18 month old Blackberry finally bit the dust. Out of this came a realization that led me to the challenge I issue at the end of this post.
Please don’t view my device failure to be a reflection on the reliability, or lack thereof, of Blackberry handsets. Rather, as a heavy user, I’ve found that the half life of my handsets is typically 18 to 24 months before things start to degrade – indeed, mobile devices do take a beating.
The obsolescence of one device is, however, a great opportunity to reflect on the age-old question: What do I acquire next? That is the subject of this posting, which focuses on the quantum changes in the mobile and smartphone market over the last couple of years.
I’ll start with a description of my smartphone usage patterns. Note that, in a later post, I plan to discuss how all this fits into a personal, multi-year odyssey toward greater mobile productivity across a range of converged devices and leveraging the cloud. Clearly, my smartphone use is just a part of that.
I’ve had Blackberry devices since the first RIM 957, and typically upgrade every year or so. I’ve watched the progression from simple push email, to pushing calendars and contacts, improved attachment support and viewing, even adding the “phone feature”. For years, the Blackberry has really focused on the core Enterprise functions of secure email, contacts and calendar and, quite frankly, delivered a seamless solution that just works, is secure and fast. It is for that reason that, up to the present day, my core, mission critical device has been a Blackberry. Over the last few years, I’ve added to that various other smartphone devices that have particular strengths, including the Nokia N95 (powered by Symbian OS), various Android devices and, my current other device, the ubiquitous Apple iPhone.
My current device usage pattern sees a Blackberry as my core device for traditional functions such as email, contacts and phone, and my iPhone for the newer, media-centric use cases of web browsing, social media, testing and using applications, and so on. Far from being rare, carrying two mobile devices seems to be the norm amongst many early adopters. Some even call it their “guilty secret.”
Over the recent past, I’ve seen my expectations of the mobile experience dramatically escalate. In reality, the term smartphone is a bit of a misnomer, as the phone function is becoming just one application among many in a complex, highly functional, personal, mobile computing device. The state of the art in converged mobile devices (smartphones and, increasingly, tablets) has indeed crossed the Rubicon. I believe that this new mobile universe is as big a break with the past for the mobile industry as the rise of the internet (particularly the web) was for the older desktop computing industry. Indeed, in several markets, 2010 is the year when smartphones outsell laptops and desktops (combined).
The new palette of capabilities of this mobile computing generation falls into several areas:
rich web browsing experience, typically powered by WebKit technology, which ironically was pioneered by ReqWireless (acquired by Google) right here in Waterloo. With the advent of HTML5, many, such as Google, view the browser as the new applications platform for consumer and business applications,
robust applications ecosystem, with a simple AppStore function to buy, install and update. iPhone and Android are pretty solid in this regard. Blackberry’s ill-fated AppWorld is an entirely different matter. For me, it was hard to find, not being on my Home Screen; application availability seemed to be (counterintuitively) dependent on the Blackberry model I was using; and the OS memory security didn’t seem up to the applications actually working reliably. (Translation: I found that loading applications onto my Blackberry made the device slower and less reliable, so I ended up removing most applications.) Whatever the reasons, the iPhone AppStore has 250,000 applications with 5 billion downloads. Android Market has over 80,000 applications, and Blackberry AppWorld lags significantly behind both.
user friendly multi-media interface, including viewing of web, media and images, with drag & drop and stretch & pinch capabilities. So far, touch screen technologies used in both iPhone and Android seem to have won the race against competing keyboard-only or stylus-based alternatives. Personally, I believe there are still huge opportunities to innovate interfaces optimized for small screens and mobile usage, so I will remain open to the emergence of alternative and competing technologies. I’m convinced that one use case scenario doesn’t fit all.
a secure, modern & scalable operating system on which to build all of the above and drive the future path of mobile computing. Given my heritage in the UNIX world starting in the 1970s, it is interesting to me that all modern smartphones seem to be built around a UNIX/Linux variant (iOS is derived from BSD UNIX and Android from Linux), which provides a proven, scalable and efficient platform for secure computing from mobiles to desktops to servers. Blackberry OS, by contrast, appears to be a victim of its long heritage, starting life less as a real operating system and more as a TCP/IP stack bundled with a Java framework that morphed over time (it sounds reminiscent of the DOS to Windows migration, doesn’t it?). To be fair, Microsoft’s Windows Phone OS also suffers from its slavish attempt to emulate Windows metaphors on smaller, lower-power devices, and the translation doesn’t work well.
I want to stress an important point. This is not solely a criticism of Blackberry being slow to move to the next mobile generation. In fact, some of the original smartphone pioneers are struggling to adapt to this new world order as well. My first smart phone was the Nokia 9000 Communicator, similar to the device pictured on the left, first launched in 1996. Until recently, Nokia, with their Symbian OS platform, was the leader in global smartphone market share. Likewise, Microsoft adapted their Windows CE Pocket PC OS, also first released in 1996, for the mobile computing market earlier in this decade; that effort is now called Windows Phone, shown on the right. Both vendors just seem to have lost the playbook for success, but continue to thrive as businesses because smartphones represent a relatively small fraction of their overall businesses. However, their respective mainstays – feature phones and desktop OS and applications – are hardly likely to continue to be the growth drivers they once were.
I need to stress another point mentioned earlier. There will be competing approaches to platform, user interface and design. While it is possible that Android could commoditize the smartphone device market in the way that Wintel commoditized the mass PC desktop and laptop marketplace, I suspect that, being ubiquitous, personal and mobile, these next generation smartphones are likely to evolve into disparate usage patterns and form factors. That said, there will certainly be significant OS and platform consolidation as the market matures.
At last I get to my challenge. As an avowed early adopter, I have aggressively worked at productivity in a “mobile nomadic” workstyle which leverages open interfaces, use of the cloud and many different techniques. Even I am surprised by the huge enabling effect of modern hardware, communications and applications infrastructure in the mobile realm. Essentially, very few tasks remain for which I am forced back to my desktop or laptop. However, the sad fact is that the current Blackberry devices (also Nokia/Symbian and Microsoft) fail to measure up in this new world. Hence the comment about Farms and Paris. The new mobile reality is Paris.
My challenge comes in two parts:
What device should replace my current Blackberry?
Since the above article doesn’t paint a very pro-Blackberry picture, what is RIM doing about this huge problem?
I should point out that I have every reason to want and hope that my next device is a Blackberry. RIM is a great company and a key economic driver for Canada, and I happen to live and work in the Waterloo area. Furthermore, I know from personal experience that RIM has some of the smartest and most innovative people in their various product design groups, not to mention gazillions of dollars that could fund any development. Rather, I would direct my comments at the Boardroom and C-Suite level, as I am baffled why they have taken so long to address the above strategic challenges, which have already re-written the smartphone landscape. Remember that the iPhone first shipped in January 2007 and the 3G version over 2 years ago, so this is not new news. Android was a bit slower out of the gate, but has achieved real traction, particularly in the last few quarters. And, to be clear, I’m not alone in this – see “Android Sales Overtake iPhone in the US”, which goes on to show that the majority of Blackberry users plan to upgrade to something other than Blackberry. The lack of strategic response, or the huge delay in making one, remains an astonishing misstep.
Therefore, if anyone senior from RIM is reading this, please help me to come to a different conclusion. I very much would like to continue carrying Blackberry products now and into the foreseeable future.
For other readers, please comment with your thoughts. What device would you carry, and more importantly, why?
[NOTE: this post was written a week before today’s launch of the Blackberry 9800 Torch with OS 6. There are definitely some promising things in this design, but it remains to be seen if, indeed, this device represents the quantum leap that the new marketplace reality requires]
“Nature is by and large to be found out of doors, a location where, it cannot be argued, there are never enough comfortable chairs.” – Fran Lebowitz
I’m a believer that Location Based Services (LBS), coupled with the latest smartphones, will evolve a number of indispensable, and unexpected, killer applications.
That said, it’s pretty clear that those mission critical applications remain to be found. Essentially, the whole LBS opportunity is a social experiment that early adopters are collaboratively helping to clarify.
It was with those thoughts in mind that I decided to start using some of the popular LBS social media applications, or should I say social games? These included FourSquare, Yelp and Gowalla.
Let me put this in the context of other social media applications with which I’ve experimented. Back in 2007, when I decided to try the microblogging service Twitter, then in its infancy, I had low expectations. In fact, I expected to hate it, but mentally committed to a two-week trial just for the purposes of self-education. Over 3 years later, I’m still using it, love it and have found many applications at which Twitter excels – personal clipping service, early information and a sense of what my universe of followees is up to, among them.
FourSquare, although popular, hasn’t (yet) passed my personal usefulness test. And, I suspect most others still consider it more a game than a mission critical application. While there is an element of fun, it seems to be the sort of thing you could easily drop without much loss.
In that context, it surprises me that FourSquare recently pushed a new version (1.7.1) to my iPhone that checks my actual proximity to locations. Since then, almost half of my check-ins fail to pass this new proximity test, even though I was physically at the location in question. Below, I have re-posted my support request, which gives more background.
But, suffice it to say, an application change that, on the surface, seemed sensible made the application way less attractive to me. That’s doubly deadly in a space which is still finding its spot. I’m interested in comments on both the major issue (startups alienating early adopters) and the specific one.
I’m surprised that FourSquare has re-written the rules of an emerging LBS service without any notification. I am referring, of course, to the latest upgrade on my iPhone, in which check-ins deemed too distant from the intended location (by an undocumented and new algorithm) are suddenly ineligible to accumulate points or badges. Because it is so fundamental, I’ve decided to re-blog this as well, because it illustrates how the law of unintended consequences can have a huge impact on a young service’s future prospects. Translation: this wasn’t a well thought out change, in so many ways.
Why do I say this? Here are just a few reasons:

1. For those of us who live in rural areas where cellular tower infrastructure is typically much more widely spaced (and often in the 850MHz band vs. the 1900MHz band, for broader coverage at lower densities), the inherent accuracy of locations reported by mobile devices is much lower. For example, at locations near me, it is not uncommon for the phone’s margin of error to be as much as 4500 m to 6000 m. Although FourSquare doesn’t divulge their required closeness, I think it may be something like 500 m. With that in mind, it is almost by definition that most rural “check-ins” will, starting this week, be flagged as ineligible. And that’s the behaviour I’m seeing. Of course, in many instances GPS lowers this error, but it is surprising how many locations don’t have great GPS reception, such as indoors or in an automobile. (The sketch below shows how easily a tower-derived fix falls outside such a threshold.)

2. By changing the rules of the game on the fly, FourSquare has penalized those checking into locations that weren’t located that accurately in the first place – whether because of the reasons in #1 or because people weren’t told they had to define the location within a certain minimum delta of the actual location. For example, I suspect that people actually defined the location as they were walking toward it, knowing that FourSquare didn’t care where the real location physically was. I find this behaviour in about 30-50% of the check-ins I’ve done since the change.
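For the technically inclined, here is a minimal sketch, in Python, of the kind of proximity test that appears to be in play. The 500 m threshold is my guess from observed behaviour, not a documented FourSquare parameter, and the coordinates are invented.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in metres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))  # mean Earth radius ~6371 km

def checkin_eligible(reported_fix, venue, threshold_m=500):
    """Guessed rule: the phone's reported position must fall within
    threshold_m of the venue. A rural cell-tower fix can easily land
    several kilometres from where the user is actually standing."""
    return haversine_m(*reported_fix, *venue) <= threshold_m

# A user standing at a (hypothetical) rural venue, but located via a
# distant cell tower rather than GPS: the fix lands roughly 5 km away.
venue = (43.55, -80.25)
tower_fix = (43.58, -80.20)
print(checkin_eligible(tower_fix, venue))  # False: check-in rejected
```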
FourSquare was an experiment for me, but given these new rules, which appear not to have been well thought out for large swathes of geography, I’m considering shutting down my personal FourSquare use. For something that still provides no direct utility, I really don’t want to have to go back and re-enter all the location information from scratch.
Through the 1960s, 1970s and into the early 1980s, Canada leveraged many of its best minds to develop technology solutions that spanned the great distances and empty spaces of our vast country, positioning Canada as a world leader in telecommunications. Today, numerous examples, from world leading companies like Blackberry to startups like Viigo or Iotum, continue to show world leadership.
Notwithstanding these points of strength, in the early 21st century there are surprising gaps in our ability to compete globally, given our early leadership. The causes are many, from regulation, standards and finance to the investment decisions of major carrier players. While there are individual success stories, like the Blackberry, there are also numerous structural issues that dampen our natural competitive position in this all-important industry.
We’ve assembled a diverse team of some of the top players shaping our mobile future to help us understand Canada’s position in the global mobile industry, where the opportunities lie, and the changes in policy and investment that might allow us to maximize our footprint in the future mobile industry:
Bob Ferchat – a Canadian mobile pioneer at the epicentre of the aforementioned world class Canadian telecom and mobile industry. Bob was CEO of Nortel Networks and later of Bell Mobility. Now retired, he has maintained a passion for continuing Canadian mobile leadership. Most notably, a few months ago, he led a group of investors that tried to buy back Nortel to keep that treasure trove of technology intellectual property in Canadian hands (see Ottawa Citizen: “Fight for Nortel Wireless in Full Swing”).
Karna Gupta – From an early career in various senior executive positions at Bell Canada, Karna has held numerous and diverse C-level positions in global mobile and enterprise software companies, including Comverse, Sitraka Mobile and OSS Solutions. Most recently, he was CEO of Certicom through their recent acquisition by RIM, a process that also included fighting off a hostile takeover bid. Karna brings a great international perspective from a diverse set of predominantly software-based initiatives.
Steven Woods – Currently heading up Google’s Waterloo site, which has a significant mobile product mandate including search and GMail, Steve recently returned to Canada from a decade in the Silicon Valley. He was founder of NeoEdge Networks and co-founder of Quack.com (acquired by AOL), both Silicon Valley-Ontario operations. It would appear that Steve and his team are in the centre of the new web-based mobile world that Google is helping to shape.
There is a huge opportunity for Canadian companies, and our entire economy, but to seize that opportunity, good policy and well-informed decision makers are important. To that end, we’ll answer questions about the mobile tech company ecosystem like:
How did we get where we are today?
How do we compare with the world?
What policies and initiatives might improve our competitive position?
What are some of the major gaps that Canada might be well positioned to fill?
I’d be very interested in comments on any topics or issues you would like to have raised. Even better, come out and ask those questions yourself. It promises to be an insightful evening.