Archive for category Future

Google Admits its Cars Occasionally Crash

You might look at the headline of the article and think that driverless cars are no good. In fact, the article states the exact opposite – the rate of crashes is low compared to the general populace, they were all low impact crashes, they were all caused by other drivers in other cars, and it has allowed Google to further improve their algorithms to attempt to deal with idiotic human drivers.

Just another plank in the bridge to total driverless cars – can’t come soon enough.

Google Admits its Cars Occasionally Crash

Google is busy developing self-driving cars for a number of reasons, one of which is their potential to reduce the number of accidents that occur on the roads each year. However, that doesn’t mean Google’s autonomous vehicles are immune from the odd crash here and there.

It turns out that Google’s self-driving cars have been involved in 11 accidents in the six years since the project began. Thankfully, these were all minor accidents with no injuries sustained by those involved. And considering that Google’s vehicles have covered 1.7 million miles in that time, these figures are actually rather refreshing.

Google maintains none of the accidents were the fault of the cars and their futuristic technology. Instead, all 11 accidents were caused by careless driving by people in other cars. And these incidents are now helping Google identify patterns of poor driving and adapt the software to better predict this flawed human behavior.

According to Chris Urmson, director of Google’s self-driving car program, there are more than 33,000 deaths on roads in the U.S. every year, and 94 percent of crashes involve human error. So, while Google still needs to get this figure of 11 down to zero, it appears the company’s autonomous vehicles are much safer than any driven by people rather than computers.

No Comments

How to do Brainstorming

PsyBlog strikes again, this time about brainstorming. As per usual, I like this blog, and here is what he has to say:


For many years brainstorming has been a very popular way for groups to generate new ideas, especially in business.

This is despite the fact that many studies have shown that groups actually produce fewer and less creative solutions than people working on their own. This was confusing: we are used to thinking that ‘many hands make light work’, and ‘two heads are better than one’.

The research showed, though, that many hands and heads made people nervous, lazy and blocked (for a more in-depth discussion see: Brainstorming Reloaded). In fact people perform better on their own at coming up with new ideas than in a brainstorming group.

This is highly perplexing. What we see from the creativity research is that great ideas often come from bolting together two so-so ideas. In other words: brainstorming should work.

Electronic Brainstorming

Now what’s emerging from the productivity research is that brainstorming is a good technique, but it needs a little tweaking.

Two candidates that provide a new twist on a promising formula are ‘Brainwriting’ and ‘Electronic Brainstorming’. Both use the basic brainstorming rules developed almost half a century ago by the advertising executive, Alex Faickney Osborn:

  1. Don’t criticize.
  2. Focus on quantity.
  3. Combine and improve ideas produced by others.
  4. Write down any idea that comes to mind, no matter how wild.

The pretty simple twist in Electronic Brainstorming is that it’s done online using any kind of internet chat method, like Microsoft Messenger. The only requirement is that all the participants can see the other ideas as they scroll down the screen.

Brainwriting, on the other hand, is a little more old-school and involves sitting together and writing down your ideas on Post-It notes. Participants initial their ideas and put them in the centre of the table for others to see. No talking is allowed.

A new study has compared both of these techniques and found that it is Electronic Brainstorming that produces the most non-redundant new ideas (Michinov, 2012).

The drawback of the Brainwriting method is that each person has to reach forward and pick up other ideas and people don’t do this as much as they should.

In contrast, Electronic Brainstorming allows (forces, even) every member to see what the others are saying. It means that the group is exposed to the flow of ideas with very little effort.

On top of this it solves some of the problems with face-to-face brainstorming. When it’s done online, each person doesn’t have to wait for the others to stop talking and is less worried about being evaluated (plus brainstormers don’t have to be in the same country!).

This probably helps to explain why people report finding Electronic Brainstorming to be a satisfying experience.

One final tip: Electronic Brainstorming research suggests the best results are gained in groups of 8 or more.


No Comments

Conversation as Ephemera

Now, I find this concept rather interesting:

The basic premise is that, in the past, engaging in a conversation was totally ephemeral – it was NOT recorded for posterity unless someone specifically wanted it to be recorded.  Hence the laws regulating secret recordings – people have to give explicit permission to be recorded.  Many computer products want to record everything for all eternity, thereby potentially leading to embarrassing or legally difficult situations in the future (those awful pictures of one when drunk, or the inappropriate status update or comment, available for everyone to see forever on Facebook).  But this is not necessarily what the participants intended, nor what they desired.  Maybe we should be moving back to ephemeral events (conversations, statements, comments, photos, etc) and only explicitly recording some as worthy of posterity.
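To make the idea concrete, here is a minimal Python sketch of a conversation store that treats ephemerality as the default and posterity as an explicit opt-in. All names here (`EphemeralLog`, `post`, `visible`) are invented for illustration:

```python
import time

class EphemeralLog:
    """An append-only conversation log whose entries expire by default,
    unless a participant explicitly marks one as worth keeping."""

    def __init__(self, ttl_seconds, clock=time.time):
        self.ttl = ttl_seconds
        self.clock = clock            # injectable clock, handy for testing
        self._entries = []            # list of (timestamp, text, keep_forever)

    def post(self, text, keep=False):
        """Record an utterance; keep=True opts it in to posterity."""
        self._entries.append((self.clock(), text, keep))

    def visible(self):
        """Entries still within their time-to-live, plus anything kept."""
        now = self.clock()
        return [text for ts, text, keep in self._entries
                if keep or now - ts < self.ttl]
```

The injected clock makes the expiry behaviour easy to test; a real system would also need to actually delete expired entries, not merely hide them.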


No Comments

10 Technology Trends for 2013

Around this time of the year (ie New Year’s), people do like predicting what is going to happen in the forthcoming year.  Apart from the obvious “game” (chortling at the predictions from years gone by which have come nowhere near true), there is still value in both preparing and reviewing predictions for the future.  They assist in focussing one’s thoughts, thinking through issues of importance, identifying directions and informing action.

So, in this spirit, Lunarpages (who are great hosting providers, in the same league as Dreamhost, also great hosting providers), offer their 10 Technology Trends for 2013.  Enjoy!

I could not just leave it with the Lunarpages futures.  Here are a couple of extras …

  • A “liquid” screen which raises real buttons/keys when required (and they disappear when not required for typing), from Tactus Technology (see TechCrunch article);


Read the rest of this entry »

No Comments

What’s in store in ’13

A few resources from a variety of sources about what? Maybe the future, maybe not!

No Comments

The Future of Computing (Now and Then) – Visualisation

Visualisation, as the name implies, is about methods and techniques for displaying (visualising) data in such a manner that the result is a further and better understanding of the underlying information – of what the data is “imparting” to the audience.

Visualisation is closely aligned with Analytics, in that Visualisation is all about how to present large volumes of complex data in such a manner that it is (relatively) simple for humans to understand the underlying message, or pattern, represented or conveyed by the data. Typically, the analytics process will apply algorithms to process the data in a variety of manners, hopefully obtaining a suitable result – which then needs to be presented such that decisions can be made and further action taken. In many instances, the analytic process itself may involve Visualisation, in order to allow the people performing the analytics to determine what to do next.

These insights into the data – what it is trying to tell us – are imparted through the power of the human mind, through its ability to make connections and understandings according to visual cues (and prior knowledge). Interestingly, the ability for humans to understand based on visualisation is mainly achieved through additional computing processing – advanced visualisation typically involves advanced information processing (and sometimes advanced hardware for particular visualisation purposes, such as 3D displays, large screen projections, etc). It should be noted, though, that visualisation has been around for quite some time – early maps are a form of visualisation – producing a visual representation of some data to assist in understanding.

In the past, Visualisation has been part of Business Intelligence (think charts, graphs, dashboards), but in a rather simple and simplistic manner. Modern visualisation uses all the power of graphics and animation (including 3D) to present a compelling vision for decision making. Visualisation today is much, much more than a few charts and graphs. It is many different types of graphical representation; it is animated timelines (combined with multiple graphic types); it is now 3D (both static and animated); and it is interactive – the initial visualisation can allow viewers to select an element, which then queries for new data (for instance, more detailed information which is then visualised, possibly using a different mode of visualisation) and allows for further interaction.

Some disciplines and specialist areas only operate based on visualisation. Areas include:

  1. Computed Axial Tomography (CAT) and Magnetic Resonance Imaging (MRI) scans in medicine only “work” because the massive amounts of data generated are presented in visual form for the specialist physician to interpret – the physician never works with the raw data, only with a computed representation.
  2. Astronomers now regularly use visualisation to process the huge amounts of data generated by modern telescopes (current estimates suggest that this data stream will exceed 1 Terabyte of data per day in the near future) and to visualise and understand how the universe works – and what it looks like, from a variety of perspectives – not simply in the human visual spectrum, but also in the infrared and ultraviolet wavelengths, as well as gamma radiation and other signals.
  3. Geospatial data, such as topography, hydrography, etc, are all presented in terms of visualisation – from as simple as a mapping display up to as complicated as real-time 3D animation through a timeline.

Visualisation today is enabled by hardware advances, specifically the inclusion of Graphical Processing Units (GPUs) in computers (specifically PC based hardware) – offloading the processing for visualisation from the standard CPU onto a dedicated and high-powered chip.

An excellent visualisation graphic of the different types of visualisations which can be produced, categorised into six different types (data; information; concept; strategy; metaphor and compound), has been produced by Visual Literacy (which provides e-Learning tutorials on visualisation). The web page is interactive, displaying an example of each type of visualisation when mousing over its entry box in the table. An excellent example of how to do visualisation well!

Some of the specific techniques used for visualisation include:

  1. a Cladogram (for display of phylogeny);
  2. a Dendrogram (for display of classifications);
  3. Graph drawing;
  4. Heat-maps;
  5. Hyper Trees;
  6. Treemapping.
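As a taste of how such techniques work under the hood, here is a minimal Python sketch of one level of the “slice” step in the simplest slice-and-dice treemapping layout: each value receives a strip of the canvas proportional to its share of the total (the function and parameter names are illustrative only):

```python
def slice_layout(values, x, y, w, h, vertical=True):
    """One level of a slice-and-dice treemap: divide the rectangle
    (x, y, w, h) into strips proportional to each value's share.
    A full treemap recurses into each strip with the axis flipped."""
    total = float(sum(values))
    rects, offset = [], 0.0
    for v in values:
        share = v / total
        if vertical:                       # slice along the width
            rects.append((x + offset, y, w * share, h))
            offset += w * share
        else:                              # slice along the height
            rects.append((x, y + offset, w, h * share))
            offset += h * share
    return rects
```

Recursing into each strip with `vertical` flipped, using the child values of that node, yields the familiar nested-rectangle treemap.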

In the future, such visualisation will link to differing models of Human Computer Interaction (HCI), including various forms of haptics (“the science of applying tactile sensation to human interaction with computers”) and immersive technologies, such as the multi-touch desktop – similar to the technology hypothesised in the movie Minority Report, based on the short story of the same name by Philip K. Dick.

No Comments

Over-estimation, Under-estimation

There’s a saying in the technology industry: People tend to overestimate what will happen two years from now and underestimate what will happen in 10.

No Comments

The Future of Computing (Now) – Social Networking

Social Networking is Facebook, right?

Well, not quite.
It is a lot more than teenagers posting updates on Facebook.
It is definitely about business – and the general public – now.
A recent commentary (as at May 2011) stated:
“With the dramatic rise in use of social media, entrepreneurs were decidedly placing themselves in one of two camps: those who saw NO potential value in social media and those who jumped on the bandwagon without a clue to where the train was heading. Game Over. Now, after a lot of experimentation and evaluation by experts and novices alike, it seems there are some real opportunities in social media for businesses of all sizes.”

Social networking is about instant communication, within an extended social setting (rather obviously). It is about stating “Your Message” – directly and instantly, to a wide group of people, rather than through some intermediary.

Historically, the intermediary is a publisher of some sort, some business which determines what is published to the wider community and what is not.


Now, one has the opportunity to build one’s own community of like minds – as large or as small as one desires. (It should be noted that social networking does NOT mean that one automatically gets a “free” community. Building a community requires effort and work. If one desires a large community, one must expend substantial effort to obtain that community. The point about modern social networking is that it is possible to build such a large community with less capital outlay than was ever required previously – placing the ability to reach such large communities within the hands of every individual, provided they so desire and make the requisite effort.)

In many ways, this is similar to the advances in music production brought about by the digital age. Today, everyone has the capacity to produce high quality music. Not everyone does so, though, nor, for those who do, are they all heard. Why not? Because (1) most people are just not interested in producing music (they would rather just listen to it); (2) many people, even though they may be technically capable of operating music studio production software, do not have the requisite talent, experience or application to produce anything of general worth; (3) many people can and do produce excellent music, yet few hear it – simply because the steps associated with making this music available to a wider audience are not taken; and (4) even if someone makes their music available, many will not hear it, due to the glut of content available, meaning that distribution and marketing take precedence in terms of music awareness – which returns us to the issue of social networking, another means of making a wider community aware of one’s content.

Social Networking Facilities

Facebook not the first

Most might consider that social networking began with the advent of Facebook in September 2006. Although Facebook would no doubt classify as the largest “social networking” site existing currently (as at 2011, with more than 500 million users world-wide), it was not the first such entity. Indeed, Friendster (started in 2002 – now concentrating on social gaming) and MySpace (started in July 2003 – now concentrating on music and bands) pre-dated Facebook and were much more successful early on, but were soon eclipsed by their successors. Nor is Facebook alone in terms of providing social networking facilities – other examples include Bebo (formed in 2005). Indeed, many web-based / cloud computing based applications offer collaboration / social networking capabilities as either core or adjuncts to their offerings. It appears that if a product is not “socially”-enabled then it will not sell.


An early example of social networking was blogging – an “en-verbened” and shortened form of the phrase “web-log” – a facility whereby one maintained a log of events / occurrences / thoughts / ideas / writings / anything of interest on the web (as a series of web pages). Specific software to allow people to create (post) and display their blog entries was soon created, with an early contender being Blogger (founded in August 1999, now a Google property, and also known as Blogspot), soon followed by other major blogging platforms (WordPress in 2003; Typepad on 6 August 2003; Movable Type in 2001; and LiveJournal in 1999).


Blogging was followed by micro-blogging – the best known example being Twitter (formed in 2006). The concept of a micro-blog is to keep the entry down to a minimum – in the case of Twitter, to 140 characters: the size of an SMS (Short Message Service) message from telephony, minus 20 characters to allow for special addressing etc, such that a micro-blog entry could be made from a mobile phone. Twitter is not the only major micro-blogging platform – others include StatusNet (formed in July 2005) and its public face,; and Plurk (formed in January 2008). Other products exhibit micro-blogging qualities through their status update capabilities (such as in Facebook, but also, for instance, Google Buzz, announced on 9 February 2010, which integrates with most Google products as well as a range of external offerings).
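The arithmetic behind that 140-character limit can be sketched in a few lines of Python (the constant and function names are illustrative only):

```python
SMS_LIMIT = 160          # characters in a single SMS message
ADDRESS_RESERVE = 20     # reserved for the sender's address/username
POST_LIMIT = SMS_LIMIT - ADDRESS_RESERVE   # = 140, the classic Twitter limit

def truncate_post(text, limit=POST_LIMIT):
    """Trim a micro-blog entry to the limit, marking any cut with an ellipsis."""
    return text if len(text) <= limit else text[:limit - 1] + "…"
```

A client enforcing the limit this way guarantees any entry can still travel as a single SMS with room for the sender’s address.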


And not to be left out, there is also the concept of mini-blogs – a cross between a full blog and a micro-blog, combining the ease and immediacy of micro-blogging with the extended entry capacity of blogging (longer text, photos, videos, etc). The major examples in the mini-blogging niche include Tumblr (formed in 2007) and Posterous (formed in July 2008).

Secondary Support Applications

The success of the various blogging / micro-blogging / mini-blogging platforms has led to the creation of a vibrant secondary support industry of individuals and organisations writing applications which integrate and work with the major products. As an example, there are a range of products which allow one to view all the “streams” of information to which one subscribes – one’s Facebook stream, one’s Twitter stream, one’s blogging news feed and other such sites. Such products include Hootsuite, TweetDeck, Stroodle and the Ubermedia apps.
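The core job of such a dashboard – presenting several already-chronological feeds as a single stream – is essentially a sorted merge. A minimal Python sketch (the feed contents and tuple layout are invented for illustration):

```python
import heapq

def merged_stream(*feeds):
    """Merge several time-sorted feeds of (timestamp, source, text)
    items into one chronological stream, as a client such as Hootsuite
    or TweetDeck must do behind the scenes."""
    return list(heapq.merge(*feeds))   # tuples compare by timestamp first

# Hypothetical feeds, each already sorted by timestamp:
twitter  = [(1, "twitter",  "tweet A"), (4, "twitter", "tweet B")]
facebook = [(2, "facebook", "status C")]
blog     = [(3, "blog",     "post D")]
```

`heapq.merge` assumes each input is already sorted, which matches how each service delivers its own stream in time order.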

People and Place

Social networking is not only about connecting people, but also about connecting people and place. Location-based social networking is enabled by smart-phones with both GPS and 3G/4G internet capabilities – people are always connected, no matter where they are.

Possibly the most pre-eminent site in this category is Foursquare, allowing people to check in to places and comment on what they are doing there, or what they experience (say, comment on a restaurant, or service from a shop, or being at a concert, etc). Google has a similar product called Google Latitude. Other examples include Gowalla, specifically targeted at travel and exploration; and Yelp, which bills itself as “the fun and easy way to find and talk about great (and not so great) local businesses”. Indeed, even Facebook is now in on the act, with its Facebook Places.
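Behind any check-in screen sits a proximity query: which venues are within walking distance of the phone’s GPS fix? Here is a minimal Python sketch using the standard haversine great-circle formula (the venue data and function names are invented for illustration):

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * asin(sqrt(a))       # Earth's mean radius ~6371 km

def nearby_venues(lat, lon, venues, radius_km=0.5):
    """Names of venues within the radius of the user -- the core query
    behind a check-in screen."""
    return [name for name, vlat, vlon in venues
            if distance_km(lat, lon, vlat, vlon) <= radius_km]
```

Real services index venues spatially (geohashes, quadtrees) so they never scan the whole list, but the distance test itself is exactly this.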


Finally, don’t forget that social media is not just text – it is all types of media, most notably video. YouTube is the most famous – or possibly, infamous – of the video sharing sites, which not only allows anyone to upload video, but allows others to comment on and share videos. The popularity of an event or item can be measured by the number of YouTube “views” a video receives. Another well used video sharing site is Vimeo, similar in concept to YouTube (allowing uploading and sharing of videos), but possibly more oriented towards information sharing, particularly in a business context. It headlines itself as: “Vimeo is a respectful community of creative people who are passionate about sharing the videos they make. We provide the best tools and highest quality video in the universe.”, indicating its focus on a defined community, as opposed to the “free-for-all” which could characterise YouTube.

These types of social networking are no longer in isolation from each other. People share their YouTube videos using Facebook. They receive their entertainment (TV shows, viral video clips, information and infotainment) through YouTube, which they then share with their friends. Breaking news is now Twitter and Facebook and YouTube – the news is witnessed, photographed or videoed by people on the spot, using their mobile phones, immediately uploaded and instantly available world-wide.

Another type of multi-media based social networking tool is Skype, allowing individuals and groups to make voice and video calls across the internet, integrated with a set of contacts.

Social Networking and Business

Social networking is not limited to individuals or personal matters. As mentioned above, and throughout this article, social networking is increasingly being used as a business tool (eg for marketing purposes) and within businesses themselves. To provide a more “serious” description of social networking for business use, it is also known as “Enterprise Social Software”. From a business perspective: “Social media at its core is all about having a dialogue with your customers – it’s about people investment.” – Blake Cahill (Principal at Banyan Branch).

Blogging and wiki capabilities for business are provided by a wide variety of systems, commercial and open source, which can be installed and operated in-house or using a software-as-a-service model. Systems such as WordPress, Typepad etc can be used in this manner. Micro-blogging facilities are available from a number of sources, such as StatusNet and Yammer (formed in September 2008), which is billed as a “free private social network for your company”.

Social Networking for Professionals

And there are specific social networking sites just for professionals. The largest is LinkedIn (formed in May 2003), where “Over 100 million professionals … exchange information, ideas and opportunities” (according to its website). XING (formed in August 2003) also bills itself as a professional business network. Both these sites are about making connections between professionals, particularly with respect to job hunting and career development, but also in terms of maintaining contact with one’s network.

And the final site is a hybrid business / individual social networking site, specifically for maintaining an address book, called Plaxo.

Social Networking and Applications

As mentioned above, many products with other tasks / functions at their core are increasingly adding collaboration and social networking / media capabilities to their offerings, since, even if not an integral component of the functionality of the product, these facilities must be provided for the product to sell. Prominent examples in the area include the myriad of project and task management products available, as well as CRM, DMS, CMS and other such suites.

Social Networking is Mainstream

As evidence of the rise of social networking, Facebook has become so ubiquitous that mainstream companies which have nothing specific to do with ICT are creating Facebook accounts and pages, displaying their details in their advertising – such as on business cards, in print ads and on TV. Car companies, banks, insurance companies (and increasingly, everyone else) are now using Facebook as their consumer entry “page”. One’s local car yard and one’s local plumber all have Facebook pages.

Social Networking – Today and in the Future

What is social networking and social media being used for today (and in the future)?

Some elements include:

  1. one-to-one conversation;
  2. one-to-intimate group “conversation” / “discussion” / “information dissemination”;
  3. event notification, organisation, attendance;
  4. location based notifications, and recommendations etc;
  5. marketing, advertising and selling;
  6. SEO (Search Engine Optimisation) enablement;
  7. disseminating materials, such as brochures, eBooks, etc;
  8. photo albums and video collections;
  9. match-making;
  10. emergency management – notifying communities and constituents during flood, fire and earthquake, etc;
  11. political advertising and commentary;
  12. political activism – dissent (and assent, if you will);
  13. automatic updating of location and connectivity, incorporating geo-mobile technologies and “internet of things” technologies;
  14. automatic understanding of one’s environment (facial recognition, for instance) and context (through natural language processing);
  15. greater decentralisation – many elements of social networking will inter-operate;
  16. search engines will embrace all social networking interactions;
  17. content aggregation (introducing a new element of electronic intermediation) will work towards sifting, analysing and presenting the mass of online data from a wide variety of sources (including all social media sources) into a useable form for the individual (and businesses);
  18. greater use of analytics – particularly big data and deep analytics, sifting through the massive amount of data generated by social networking tools and systems, to identify patterns and understand what is happening, or what is relevant to one’s area of interest (business or otherwise);
  19. social rating – of sites, of products, of places and of experiences – will assume a marketing prominence;
  20. universal identities – the same identity used on all social networks (most likely integrated with security a la OpenID);
  21. a single social graph, integrating not only “standard” social network sites such as Facebook and LinkedIn but also all of email, Skype, IM etc;
  22. more platform facilities and uses, for applications, games etc within a social networking site (such as the Facebook Platform);
  23. greater integration with mainstream business applications – packages and custom built systems. Email, database management systems, document management systems, content management systems and many others will integrate collaboration and social networking capabilities.
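As a tiny illustration of the analytics element above, pattern-finding over a stream of social posts can start as simply as counting hashtags. This is a deliberately minimal Python sketch (the function name and sample posts are invented); real big-data analytics operates at vastly larger scale, but the principle of aggregating the stream to surface patterns is the same:

```python
import re
from collections import Counter

def trending_hashtags(posts, top=3):
    """Count hashtag occurrences across a stream of posts and return
    the most frequent -- the simplest form of social-stream analytics."""
    tags = Counter()
    for post in posts:
        tags.update(t.lower() for t in re.findall(r"#\w+", post))
    return tags.most_common(top)
```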

No Comments

The Future of Computing (Now) – Cloud Computing

What is Cloud Computing?

In simple terms, cloud computing builds on the foundations of virtualised resources (compute resources, storage resources, network resources), providing an additional level of configuration and control across multiple virtual environments, as well as the capability of implementing “self service” facilities.

“Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

This cloud model promotes availability and is composed of:

  1. five essential characteristics:
    1. On-demand self-service;
    2. Broad network access;
    3. Resource pooling;
    4. Rapid elasticity;
    5. Measured Service;
  2. three service models:
    1. Cloud Software as a Service (SaaS);
    2. Cloud Platform as a Service (PaaS);
    3. Cloud Infrastructure as a Service (IaaS); and,
  3. four deployment models:
    1. Private cloud;
    2. Community cloud;
    3. Public cloud;
    4. Hybrid cloud.

Key enabling technologies include:

  1. fast wide-area networks;
  2. powerful, inexpensive server computers; and
  3. high-performance virtualization for commodity hardware.”

(Source: NIST)
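Two of the NIST characteristics – “rapid elasticity” and “measured service” – can be illustrated together with a toy scaling rule: measure utilisation of the current pool, then provision or release instances to steer it back towards a target. The function, target value and policy below are invented for illustration, not any vendor’s actual algorithm:

```python
import math

def desired_instances(current, utilisation_pct, target_pct=60, minimum=1):
    """Toy 'rapid elasticity' rule: given the measured utilisation of the
    current pool (as a percentage of its capacity), return the instance
    count that brings utilisation back towards the target percentage."""
    needed = current * utilisation_pct / target_pct   # capacity actually in use, rescaled
    return max(minimum, math.ceil(needed))            # round up; never below the floor
```

So a pool of 4 instances running at 90% utilisation would grow to 6, while the same pool at 15% would shrink back to 1 – resources “rapidly provisioned and released” from what the meter reports, as the NIST definition describes.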

“Cloud computing is a category of computing solutions in which a technology and/or service lets users access computing resources on demand, as needed, whether the resources are physical or virtual, dedicated, or shared, and no matter how they are accessed (via a direct connection, LAN, WAN, or the Internet). The cloud is often characterized by self-service interfaces that let customers acquire resources when needed as long as needed. Cloud is also the concept behind an approach to building IT services that takes advantage of the growing power of servers and virtualization technologies.”  (Source: IBM)

Cloud Computing is now one of the “hot topics” in ICT.  Almost all major vendors have some semblance of a cloud computing offering, however that may be defined (since, as with most “hot topics”, vendors and others define an amorphous term such as cloud computing in a manner which best suits their interests).

Other terminology is sometimes used in conjunction with (and sometimes, erroneously, synonymous with) cloud computing.  The terms SaaS (Software as a Service), PaaS (Platform as a Service) and IaaS (Infrastructure as a Service) can all be considered as sub-variants of the more generic term “cloud computing”.  The diagram in this IBM introductory material further elucidates these differences.


Public, Private and Hybrid Clouds

“In general, a public (external) cloud is an environment that exists outside a company’s firewall. It can be a service offered by a third-party vendor. It could also be referred to as a shared or multi-tenanted, virtualized infrastructure managed by means of a self-service portal.

A private (internal) cloud reproduces the delivery models of a public cloud and does so behind a firewall for the exclusive benefit of an organization and its customers. The self-service management interface is still in place while the IT infrastructure resources being collected are internal.

In a hybrid cloud environment, external services are leveraged to extend or supplement an internal cloud.”  (Source: IBM)

Diagrammatically, the three (3) types of cloud computing offerings can be depicted as:

Cloud Computing Types

(Source: Sam Johnston)

Private Clouds

“Private clouds presents (sic) a shift from a model where everything is customized to one of standardization. Management in such an environment is no longer about avoiding change but instead embracing it to facilitate IT’s twin goals: delivering on the needs of the business and managing the underlying resources in the most efficient way possible.

The move to private cloud represents an industrial revolution for IT, applying industrial manufacturing techniques to the provisioning of IT services, gaining standardization and automation.

Standardization is central to achieving much greater operational efficiency.  Private clouds not only facilitate standardization but dramatically increase the returns on standardization. Deploying standard infrastructure from a templatized catalog of applications is orders of magnitude faster and easier than building each application from scratch. Similar gains are available from centralizing and standardizing high availability, network management, and security.

To take advantage of the cloud, there needs to be a clear separation of the production versus consumption layer. In the cloud, the consumer (the business) has no idea – and importantly, little interest in or concern with – what hardware platform and management tools are being used to deliver services.”  (Source: VMWare)

It should be noted that some commentators (for instance, Sam Johnston in his “Random rants about tech stuff (cloud computing, intellectual property, security, etc.)“) suggest that Private Clouds are a neologism to justify various vendors’ offerings in competition with the “pure” Public Cloud model.  Nevertheless, even these commentators acknowledge that Private (and Hybrid) Clouds are likely to be used into the immediate future as organisations come to grips with a new way of providing computing facilities.


What Should Run in the Cloud?

Since the “cloud” can effectively implement any computing environment (operating systems, etc), then basically anything could be run in the cloud.  As with most things in life, just because it is possible does not necessarily make it either desirable or useful (or even usable).

Typically, highly interactive applications (such as high-end graphics manipulation or high-end development IDEs) may operate best in a desktop or workstation environment.  But the boundaries between a pure cloud environment and a pure desktop environment (and, now, even a pure mobile environment) are becoming increasingly blurred.  In many instances, what were previously desktop-only applications are now connected to cloud facilities, typically for data storage, but also for additional processing capability (for instance, rendering complex images using the extra compute resources available in cloud facilities).  In the same manner, mobile applications will store (synchronise) data using a cloud facility, thereby allowing a single view of one’s data whether it is accessed through a web-based interface (into the cloud), a mobile device interface (ie on a smartphone) or a desktop interface (ie an MS Windows application).
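The “single view of one’s data” idea can be sketched in a few lines. The `CloudStore` class below is a hypothetical stand-in for a real cloud storage service; the point is that every device front-end reads and writes the same cloud-side copy rather than holding its own.

```python
class CloudStore:
    """Stand-in for a cloud storage service shared by all devices."""
    def __init__(self):
        self._data = {}

    def put(self, key, value):
        self._data[key] = value

    def get(self, key):
        return self._data.get(key)

class DeviceClient:
    """A desktop, mobile, or web client that synchronises via the cloud."""
    def __init__(self, name, store):
        self.name = name
        self.store = store

    def save(self, key, value):
        self.store.put(key, value)   # write-through to the cloud

    def load(self, key):
        return self.store.get(key)   # always reads the shared copy

cloud = CloudStore()
phone = DeviceClient("smartphone", cloud)
desktop = DeviceClient("desktop", cloud)
phone.save("notes.txt", "draft written on the train")
# The desktop sees the same data with no device-to-device transfer.
latest = desktop.load("notes.txt")
```

A real implementation would of course add authentication, conflict handling and offline caching, but the architecture is the same: the cloud facility, not any one device, holds the authoritative copy.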

In addition, applications with extremely sensitive security profiles would most likely not be run in a public cloud or hybrid environment (although they could readily be conceived as operating in a secure private cloud environment).

Everything else is amenable to cloud based operation.


How big is Cloud Computing?

An interesting infographic from the Wikibon site and its blog provides insight into the current and projected size of cloud computing, including the economics of why cloud computing is here to stay …

How Big is the World of Cloud Computing?
Via: Wikibon

No Comments

The Future of Computing (Then) – The Internet of Things

Computing is no longer about a massive server sitting in a special-purpose room, with PCs connected to it via a local network.  Nowadays, computing is everywhere and in everything.  Ordinary items (consumer white goods, motor vehicles, and so on) all have computing power within them.  Increasingly, this computing power also incorporates network connectivity (typically of a wireless kind).

The future of living is about the things in our environment all talking to each other and to our applications.  Think of solar power generation and electricity consumption, all monitored and controlled using computer-enabled “things”.
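The solar power example can be sketched as a tiny publish/subscribe loop, with sensors publishing readings and a controller reacting to them. All the class and topic names here are invented for illustration; a real deployment would use a proper messaging protocol rather than in-process callbacks.

```python
class MessageBus:
    """Minimal in-process publish/subscribe bus for 'things' to talk on."""
    def __init__(self):
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def publish(self, topic, value):
        for callback in self._subscribers:
            callback(topic, value)

class EnergyController:
    """Flags a surplus whenever solar generation exceeds consumption."""
    def __init__(self, bus):
        self.generation = 0.0
        self.consumption = 0.0
        self.surplus = False
        bus.subscribe(self.on_reading)

    def on_reading(self, topic, watts):
        if topic == "solar/generation":
            self.generation = watts
        elif topic == "house/consumption":
            self.consumption = watts
        self.surplus = self.generation > self.consumption

bus = MessageBus()
controller = EnergyController(bus)
bus.publish("solar/generation", 3200.0)   # a sunny afternoon
bus.publish("house/consumption", 1100.0)  # current household load
```

The controller could just as easily switch on a hot-water heater or charge a battery when `surplus` becomes true; the essential point is that the devices never talk to each other directly, only through shared messages.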

As an example, consider GreenGoose.  From their website (and other sources):

“Sensors measure actions you take to reach goals that you select. They’re wireless, battery-powered, and all a little different.  The exercise sensor is the size of a credit card and slips into your wallet, purse or backpack. The others are stickers that you just stick on to things like a water-bottle, toothbrush or floss.  You stick these sensors on your bike, thermostat, showerhead “and even your keychain”.  Each one measures a different thing you do, but they all communicate with the same egg-sized base-station.  They communicate with a gateway you plug into your broadband router. Installation takes less than five minutes and you can do it yourself.  GreenGoose lets you set simple lifestyle goals. Track your own progress automatically with sensors.  Earn lifestyle points the more often you do things. Bonus points for consistency.  Share or exchange points with other applications, or partners offering rewards – or even an allowance for kids.  Eventually this type of connection, between sensors and mainstream services like banking, will be commonplace and probably won’t need to rely on gimmicks such as green eggs. But for now, Green Goose seems like a cute, interesting Internet of Things service for green conscious early adopters to try out.”

Basically, GreenGoose is all about connecting various things (typically exercise-related things) into the (computing) network, such that activity is automatically measured and recorded, and then used for other purposes (in this instance, rewarding oneself for completing exercise activity).

No Comments