Archive for category Computing

ASUS Eee Pad Transformer and WiFi Connection

So, I got an ASUS Eee Pad Transformer last weekend.  A beautiful little tablet computer, which docks into a very nice keyboard so easily and solidly that you just want to leave it docked. Indeed, when it is docked, it is like a small notebook – slim, tidy, good looking and very very light.
It is very easy to carry, even with the keyboard.  The touch screen works well, and you can readily type when it is sitting on your lap, even at speed.

It runs Android 3 – so there is a bunch of software available, including many of my favourite online apps which I have been using on the web and on my iPhone. And some of the functionality in the new Android apps is very good.

As you can see, I am quite happy with the new machine (I am writing this post on it at this very moment).

I do have a whinge though.

What is it about software developers?  Don’t they ever learn?

The WiFi connection capabilities of both the ASUS Eee Pad and Android leave a lot to be desired.

Firstly, my encounter with the ASUS WiFi.

The machine worked perfectly well in the shop. Got it home and everything worked, except that it would not connect to the WiFi. Nothing I tried would work. It just would not see my network, although it could see other people’s networks. I went to someone else’s house and it connected to their network.

Some investigating on the internet (luckily I have many other computers available to do this investigation, and I am quite good at this sort of research) revealed that the ASUS appears to be programmed to connect only to WiFi channels 1 through 11 – any channel above that will not be found. In the US, no channels above 11 are used, whereas channels 12 and 13 are used extensively elsewhere in the world. So the dumb-ass programmers who think that there is no other place in the world than the US did not bother to make it work everywhere, nor to test it outside their lab.

I had to go into my WiFi Router and change its settings so that it used 802.11b/g only, rather than 802.11n. After rebooting everything, the ASUS Eee Pad connected and everything was fine.
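To see why a channel lock-out like this stays invisible until the device leaves the US, here is a minimal Python sketch of the situation. The channel sets and centre frequencies are standard 2.4 GHz values; the function names are mine for illustration, not anything from the ASUS firmware:

```python
# Hypothetical sketch: what a US-locked 2.4 GHz radio can and cannot see.
# US regulatory domain: channels 1-11. Most of the rest of the world: 1-13.

US_CHANNELS = set(range(1, 12))      # channels 1-11
WORLD_CHANNELS = set(range(1, 14))   # channels 1-13

def centre_frequency_mhz(channel: int) -> int:
    """Centre frequency of a 2.4 GHz channel (valid for channels 1-13)."""
    return 2407 + 5 * channel

def router_is_visible(router_channel: int, client_allowed=US_CHANNELS) -> bool:
    """A client hard-coded to a channel set simply never scans the others."""
    return router_channel in client_allowed

# A router on channel 13 is invisible to a US-locked client,
# even though the neighbours' routers (on lower channels) show up fine.
assert router_is_visible(6) is True
assert router_is_visible(13) is False
assert router_is_visible(13, WORLD_CHANNELS) is True
assert centre_frequency_mhz(13) == 2472
```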

For a while!

A couple of days later, the machine just would not connect to the WiFi again – but this time, it was even worse.  Every time that it tried to “Obtain an IP Address” it would reboot. Time after time after time. Absolutely frustrating.  This time I thought there must be something wrong with the hardware, so I rang the ASUSTek support line.  They told me that I need to reload the Android kernel image, since sometimes the WiFi connection settings are overwritten with bogus values and it causes the machine to reboot.

So I did the kernel image reset a number of times. It seemed to work once or twice, and then it was back to the same rebooting behaviour. Since this was now the weekend, I did some research on the internet again and read about some weird behaviour relating to not getting addresses from the DHCP server on the WiFi Router, or something. Not directly applicable, but close enough for me to follow up this lead.

Investigating both the WiFi Router and the Internet Modem Router (a legacy of the old setup I inherited at home), I noticed that the Modem Router was serving IP addresses in a range starting at the very address held by the WiFi Router. Thus, there was potentially a conflict. Well, NOTHING else that has connected to the network over the last couple of years has had a problem with this potential conflict.

Along comes Android, sees that there is some sort of problem (how, I am not sure at all) and then decides to REBOOT when it sees this problem.


I ask you. Couldn’t the programmers think of, what, an ERROR MESSAGE? Say, “Error obtaining IP Address” at its simplest, or even “Conflict in IP Address resolution” – display the message and then continue operating. How simple would that be? Maybe they could even be smart enough to move on, obtain another IP Address and avoid the problem entirely (I know, they will argue that my WiFi Router and Modem Router should not have been set up like that in the first place – but hey, everything else seemed able to handle the situation).

But to REBOOT the machine. Automatically.  You have GOT TO BE KIDDING!!!

This strikes me as the stupidest piece of coding that I have ever come across in my life.  The people at Google that write this code should be ashamed of themselves.

It is another example of how the ICT industry shoots itself in the foot all the time. And an example of why Apple is now the number 1 ICT company (by market capitalisation) in the world today: they try hard to design tools which are usable by anybody and everybody and which do not suffer these types of issues.

What ordinary person is going to know what DHCP means? (Damn Huge Crappy Programming probably).
What ordinary person is going to know how to change settings inside a WiFi Router and a Modem Router, hidden deep in “Advanced” menus, to somehow get a WiFi connection going?
What ordinary person is going to want to know about a Kernel Image Reset – why on earth would they need to do something like that?

Honestly, I love the Android concept, and Google, and all that stuff. But you have to do better than this.

I am not giving up the machine – it is too nice. But golly, it makes it hard to recommend to novices – you know, all those consumers who make up the bulk of the populace that you (Mr ASUS and Mr Google) want to sell to.


The Future of Computing (Now and Then) – Visualisation

Visualisation, as the name implies, is about methods and techniques for displaying (visualising) data in such a manner that the result is a further and better understanding of the underlying information or knowledge – what the data is “imparting” to its audience.

Visualisation is closely aligned with Analytics in many ways, in that Visualisation is all about presenting large volumes of complex data in such a manner that it is (relatively) simple for humans to understand the underlying message, or pattern, conveyed by the data. Typically, the analytics process will apply algorithms to process the data in a variety of ways, hopefully obtaining a suitable result – which then needs to be presented such that decisions can be made and further action taken. In many instances, the analytic process itself may involve Visualisation, in order to allow the people performing the analytics to determine what to do next.

These insights into the data – what it is trying to tell us – are imparted through the power of the human mind, through its ability to make connections and understandings according to visual cues (and prior knowledge). Interestingly, this human understanding through visualisation is mainly achieved through additional computer processing – advanced visualisation typically involves advanced information processing (and sometimes advanced hardware for particular visualisation purposes, such as 3D displays, large screen projections, etc). It should be noted, though, that visualisation has been around for quite some time – early maps are a form of visualisation, producing a visual representation of some data to assist in understanding.

In the past, Visualisation has been part of Business Intelligence (think charts, graphs, dashboards), but in a rather simple and simplistic manner. Modern visualisation uses all the power of graphics and animation (including 3D) to present a compelling vision for decision making. Visualisation today is much, much more than a few charts and graphs. It is many different types of graphical representations; it is animated timelines (combined with multiple graphic types); it is now 3D (both static and animated); and it is interactive – the initial visualisation can allow the viewer to select an element, which then queries for new data (for instance, more detailed information), which is in turn visualised (possibly using a different mode of visualisation) and allows for further interaction.

Some disciplines and specialist areas only operate based on visualisation. Areas include:

  1. Computed Axial Tomography (CAT) and Magnetic Resonance Imaging (MRI) scans in medicine only “work” because the massive amounts of data generated are presented in a visual form for the specialist physician to interpret – the physician never works with the raw data, only with a computed representation.
  2. Astronomers now regularly use visualisation to process the huge amounts of data generated by modern telescopes (current estimates suggest that this data stream will exceed 1 Terabyte of data per day in the near future), and to visualise and understand how the universe works – and what it looks like, from a variety of perspectives – not simply in the human visual spectrum, but also in the infrared and ultraviolet wavelengths, as well as gamma radiation and other signals.
  3. Geospatial data, such as topography, hydrography, etc, are all presented in terms of visualisation – from something as simple as a mapping display, up to something as complicated as real-time 3D animation through a timeline.

Visualisation today is enabled by hardware advances, specifically the inclusion of Graphical Processing Units (GPUs) in computers (specifically PC-based hardware) – offloading the processing for visualisation from the standard CPU onto a dedicated, high-powered chip.

An excellent visualisation graphic of the different types of visualisations which can be produced, categorised into six different types (data; information; concept; strategy; metaphor; and compound), has been produced by Visual Literacy (which provides e-Learning tutorials on visualisation). The web page is interactive, displaying an example of each type of visualisation when mousing over its entry in the table. An excellent example of how to do visualisation well!

Some of the specific techniques used for visualisation include:

  1. a Cladogram (for display of phylogeny);
  2. a Dendrogram (for display of classifications);
  3. Graph drawing;
  4. Heat-maps;
  5. Hyper Trees;
  6. Treemapping.
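To give a flavour of the simplest of these techniques, here is a minimal “slice and dice” treemap layout in Python – a single level only, with no recursion or squarifying, purely for illustration (the function name and interface are mine):

```python
# One level of slice-and-dice treemapping: each value gets a rectangle
# whose area is proportional to its weight, sliced along one axis.
# Real treemap algorithms recurse into sub-rectangles, alternating axes.

def treemap_slice(values, x, y, w, h, vertical=True):
    """Return (value, rect) pairs; rect is (x, y, width, height)."""
    total = sum(values)
    rects = []
    for v in values:
        frac = v / total
        if vertical:                 # slice the strip along the x axis
            rects.append((v, (x, y, w * frac, h)))
            x += w * frac
        else:                        # slice the strip along the y axis
            rects.append((v, (x, y, w, h * frac)))
            y += h * frac
    return rects

layout = treemap_slice([50, 25, 25], 0, 0, 100, 100)
# Widths (and hence areas) come out proportional to the weights.
assert [w for _, (_, _, w, _) in layout] == [50.0, 25.0, 25.0]
```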

In the future, such visualisation will link to differing models of Human Computer Interaction (HCI), including various forms of haptics (“the science of applying tactile sensation to human interaction with computers”) and immersive technologies, such as the multi-touch desktop – similar to the technology hypothesised in the film Minority Report, based on the short story of the same name by Philip K. Dick.


Over-estimation, Under-estimation

There’s a saying in the technology industry: People tend to overestimate what will happen two years from now and underestimate what will happen in 10.


The Future of Computing (Now) – Social Networking

Social Networking is Facebook, right?

Well, not quite.
It is a lot more than teenagers posting updates on Facebook.
It is definitely about business – and the general public – now.
A recent commentary (as at May 2011) stated:
“With the dramatic rise in use of social media, entrepreneurs were decidedly placing themselves in one of two camps: those who saw NO potential value in social media and those who jumped on the bandwagon without a clue to where the train was heading. Game Over. Now, after a lot of experimentation and evaluation by experts and novices alike, it seems there are some real opportunities in social media for businesses of all sizes.”

Social networking is about instant communication, within an extended social setting (rather obviously). It is about stating “Your Message” – directly and instantly, to a wide group of people, rather than through some intermediary.

Historically, the intermediary is a publisher of some sort, some business which determines what is published to the wider community and what is not.


Now, one has the opportunity to build one’s own community of like minds – as large or as small as one desires. (It should be noted that social networking does NOT mean that one automatically gets a “free” community. Building a community requires effort and work. If one desires a large community, one must expend substantial effort to obtain it. The point about modern social networking is that it is possible to build such a large community with less capital outlay than was ever required previously – placing the ability to reach such large communities within the hands of every individual, provided they so desire and make the requisite effort.)

In many ways, this is similar to the advances in music production brought about by the digital age. Today, everyone has the capacity to produce high quality music. Not everyone does so, though, nor, of those who do, are many heard. Why not? Because (1) most people are just not interested in producing music (they would rather just listen to it); (2) many people, even though they may be technically capable of operating music studio production software, do not have the requisite talent, experience or application to produce anything of general worth; (3) many people can and do produce excellent music, yet few hear it – simply because the steps associated with making this music available to a wider audience are not taken; and (4) even if someone makes their music available, many will not hear it, due to the glut of content available, meaning that distribution and marketing take precedence in terms of music awareness – which returns us to the issue of social networking, as another means of making a wider community aware of one’s content.

Social Networking Facilities

Facebook not the first

Most might consider that social networking began with the advent of Facebook in September 2006. Although Facebook would no doubt classify as the largest social networking site currently existing (as at 2011, with more than 500 million users world-wide), it was not the first such entity – indeed, Friendster (started in 2002, now concentrating on social gaming) and MySpace (started in July 2003, now concentrating on music and bands) pre-dated Facebook and were much more successful early on, but were soon eclipsed by their successors – and it is certainly not alone in providing social networking facilities (other examples include Bebo, formed in 2005). Indeed, many web-based / cloud computing based applications offer collaboration / social networking capabilities as either core or adjuncts to their offerings. It appears that if a product is not “socially”-enabled then it will not sell.


An early example of social networking was blogging – an “en-verbened” and shortened form of the phrase “web-log” – a facility whereby one maintained a log of events / occurrences / thoughts / ideas / writings / anything of interest on the web (as a series of web pages). Specific software to allow people to create (post) and display their blog entries was soon created, with an early contender being Blogger (founded in August 1999, now a Google property, and also known as Blogspot), soon followed by other major blogging platforms (WordPress in 2003; TypePad on 6 August 2003; Movable Type in 2001; and LiveJournal in 1999).


Blogging was followed by micro-blogging – the best known example being Twitter (formed in 2006 / 2007). The concept of a micro-blog is to keep each entry down to a minimum – in the case of Twitter, to 140 characters: the size of an SMS (Short Message Service) message from telephony, minus 20 characters to allow for special addressing etc, such that a micro-blog entry could be made from a mobile phone. Twitter is not the only major micro-blogging platform (others include StatusNet, formed in July 2005, along with its public face; and Plurk, formed in January 2008). Other products exhibit micro-blogging qualities through their status update capabilities (such as Facebook, but also, for instance, Google Buzz, announced on 9 February 2010, which integrates with most Google products as well as a range of external offerings).
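The 140-character arithmetic can be written down directly (a trivial sketch; the constant and function names are mine, not Twitter’s):

```python
# An SMS carries 160 characters; Twitter reserved 20 of them for
# addressing overhead, leaving 140 for the message body itself.

SMS_LENGTH = 160
RESERVED_FOR_ADDRESSING = 20
MICROBLOG_LIMIT = SMS_LENGTH - RESERVED_FOR_ADDRESSING

def fits_in_one_update(text: str) -> bool:
    """True if the text fits within a single micro-blog entry."""
    return len(text) <= MICROBLOG_LIMIT

assert MICROBLOG_LIMIT == 140
assert fits_in_one_update("x" * 140)
assert not fits_in_one_update("x" * 141)
```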


And not to be left out, there is also the concept of mini-blogs – a cross between a full blog and a micro-blog, combining the ease and immediacy of micro-blogging with the extended entry capacity of blogging (longer text, photos, videos, etc). The major examples in the mini-blogging niche include Tumblr (formed in 2007) and Posterous (formed in July 2008).

Secondary Support Applications

The success of the various blogging / micro-blogging / mini-blogging platforms has led to the creation of a vibrant secondary support industry of individuals and organisations writing applications which integrate and work with the major products. As an example, there is a range of products which allow one to view all the “streams” of information to which one subscribes – one’s Facebook stream; one’s Twitter stream; one’s blogging news feed; and other such sites. Such products include Hootsuite, TweetDeck, Stroodle and the Ubermedia apps.
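At their core, these stream readers merge several time-ordered feeds into a single combined timeline – something like this Python sketch (the feed names and entries are invented for illustration):

```python
# Merge several already-time-ordered feeds into one combined timeline.
# Each entry is (timestamp, text); each feed is sorted by timestamp.
import heapq

facebook = [(1, "fb: photo posted"), (4, "fb: status update")]
twitter  = [(2, "tw: breaking news"), (3, "tw: reply")]
blog     = [(5, "blog: new post")]

# heapq.merge keeps the combined stream sorted by timestamp without
# needing to load and re-sort every feed in one go.
timeline = list(heapq.merge(facebook, twitter, blog))
assert [t for t, _ in timeline] == [1, 2, 3, 4, 5]
```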

People and Place

Social networking is not only about connecting people, but also about connecting people and place. Location-based social networking is enabled by smart-phones with both GPS and 3G/4G internet capabilities – people are always connected, no matter where they are.

Possibly the most pre-eminent site in this category is FourSquare, allowing people to check in to places and comment on what they are doing there, or what they experience (say, commenting on a restaurant, or service from a shop, or being at a concert, etc). Google has a similar product called Google Latitude. Other examples include Gowalla, specifically targeted at travel and exploration; and Yelp, which bills itself as “the fun and easy way to find and talk about great (and not so great) local businesses”. Indeed, even Facebook is now in on the act, with its Facebook Places.


Finally, don’t forget that social media is not just text – it is all types of media, most notably video. YouTube is the most famous – or possibly, infamous – of the video sharing sites, which not only allows anyone to upload video, but allows others to comment on and share videos. The popularity of an event or item can be measured by the number of YouTube “views” a video receives. Another well used video sharing site is Vimeo, similar in concept to YouTube (allowing uploading and sharing of videos), but possibly more oriented towards information sharing, particularly in a business context. It headlines itself as: “Vimeo is a respectful community of creative people who are passionate about sharing the videos they make. We provide the best tools and highest quality video in the universe.” – indicating its focus on a defined community, as opposed to the “free-for-all” which could characterise YouTube.

These types of social networking are no longer in isolation from each other. People share their YouTube videos using Facebook. They receive their entertainment (TV shows, viral video clips, information and infotainment) through YouTube, which they then share with their friends. Breaking news is now Twitter and Facebook and YouTube – the news is witnessed, photographed or videoed by people on the spot, using their mobile phones, immediately uploaded and instantly available world-wide.

Another type of multi-media based social networking tool is Skype, allowing individuals and groups to make voice and video calls across the internet, integrated with a set of contacts.

Social Networking and Business

Social networking is not limited to individuals or personal matters. As mentioned above, and throughout this article, social networking is increasingly being used as a business tool (eg for marketing purposes) and within businesses themselves. To provide a more “serious” description of social networking for business use, it is also known as “Enterprise Social Software”. From a business perspective: “Social media at its core is all about having a dialogue with your customers – it’s about people investment.” – Blake Cahill, Principal at Banyan Branch.

Blogging and wiki capabilities for business are provided by a wide variety of systems, commercial and open source – which can be installed and operated in-house or using a software-as-a-service model. Systems such as WordPress, TypePad etc can be used in this manner. Micro-blogging facilities are available from a number of sources, such as StatusNet and Yammer (formed in September 2008), which is billed as a “free private social network for your company”.

Social Networking for Professionals

And there are specific social networking sites just for professionals. The largest is LinkedIn (formed in May 2003), where “Over 100 million professionals … exchange information, ideas and opportunities” (according to its website). XING (formed in August 2003) also bills itself as a professional business network. Both these sites are about making connections between professionals, particularly with respect to job hunting and career development, but also in terms of maintaining contact with one’s network.

And the final site is a hybrid business / individual social networking site, specifically for maintaining an address book, called Plaxo.

Social Networking and Applications

As mentioned above, many products with other tasks / functions at their core are increasingly adding collaboration and social networking / media capabilities to their offerings, since, even if not an integral component of the functionality of the product, these facilities must be provided for the product to sell. Prominent examples in the area include the myriad of project and task management products available, as well as CRM, DMS, CMS and other such suites.

Social Networking is Mainstream

As evidence of the rise of social networking, Facebook has become so ubiquitous that mainstream companies, who have nothing specific to do with ICT, are creating Facebook accounts and pages, displaying their details in their advertising – such as on business cards, in print ads and on TV. Car companies, banks, insurance companies (and increasingly, everyone else) are now using Facebook as their consumer entry “page”. One’s local car yard and one’s local plumber all have Facebook pages.

Social Networking – Today and in the Future

What is social networking and social media being used for today (and in the future)?

Some elements include:

  1. one-to-one conversation;
  2. one-to-intimate group “conversation” / “discussion” / “information dissemination”;
  3. event notification, organisation and attendance;
  4. location based notifications, and recommendations etc;
  5. marketing, advertising and selling;
  6. SEO (Search Engine Optimisation) enablement;
  7. disseminating materials, such as brochures, eBooks, etc;
  8. photo albums and video collections;
  9. match-making;
  10. emergency management – notifying communities and constituents during flood, fire and earthquake, etc;
  11. political advertising and commentary;
  12. political activism – dissent (and assent, if you will);
  13. automatic updating of location and connectivity, incorporating geo-mobile technologies and “internet of things” technologies;
  14. automatic understanding of one’s environment (facial recognition, for instance) and context (through natural language processing);
  15. greater decentralisation – many elements of social networking will inter-operate;
  16. search engines will embrace all social networking interactions;
  17. content aggregation (introducing a new element of electronic intermediation) will work towards sifting, analysing and presenting the mass of online data from a wide variety of sources (including all social media sources) into a useable form for the individual (and businesses);
  18. greater use of analytics – particularly big data and deep analytics, sifting through the massive amount of data generated by social networking tools and systems, to identify patterns and understand what is happening, or what is relevant to one’s area of interest (business or otherwise);
  19. social rating – of sites, of products, of places and of experiences – will assume a marketing prominence;
  20. universal identities – the same identity used on all social networks (most likely integrated with security a la OpenID);
  21. a single social graph, integrating not only “standard” social network sites such as Facebook and LinkedIn but also all of email, Skype, IM etc;
  22. more platform facilities and uses, for applications, games etc within a social networking site (such as the Facebook Platform);
  23. greater integration with mainstream business applications – packages and custom built systems. Email, database management systems, document management systems, content management systems, and many others will integrate collaboration and social networking capabilities.


The Future of Computing (Now) – Cloud Computing

What is Cloud Computing?

In simple terms, cloud computing builds on the foundations of virtualised resources (compute resources, storage resources, network resources), providing an additional level of configuration and control across multiple virtual environments, as well as the capability of implementing “self service” facilities.

“Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.

This cloud model promotes availability and is composed of:

  1. five essential characteristics:
    1. On-demand self-service;
    2. Broad network access;
    3. Resource pooling;
    4. Rapid elasticity;
    5. Measured Service;
  2. three service models:
    1. Cloud Software as a Service (SaaS);
    2. Cloud Platform as a Service (PaaS);
    3. Cloud Infrastructure as a Service (IaaS); and,
  3. four deployment models:
    1. Private cloud;
    2. Community cloud;
    3. Public cloud;
    4. Hybrid cloud.

Key enabling technologies include:

  1. fast wide-area networks;
  2. powerful, inexpensive server computers; and
  3. high-performance virtualization for commodity hardware.”

(Source: NIST)

“Cloud computing is a category of computing solutions in which a technology and/or service lets users access computing resources on demand, as needed, whether the resources are physical or virtual, dedicated, or shared, and no matter how they are accessed (via a direct connection, LAN, WAN, or the Internet). The cloud is often characterized by self-service interfaces that let customers acquire resources when needed as long as needed. Cloud is also the concept behind an approach to building IT services that takes advantage of the growing power of servers and virtualization technologies.”  (Source: IBM)

Cloud Computing is now one of the “hot topics” in ICT.  Almost all major vendors have some semblance of a cloud computing offering, however that may be defined (since, as with most “hot topics”, vendors and others define an amorphous term such as cloud computing in a manner which best suits their interests).

Other terminology is sometimes used in conjunction with (and sometimes, erroneously, synonymous with) cloud computing.  The terms SaaS (Software as a Service), PaaS (Platform as a Service) and IaaS (Infrastructure as a Service) can all be considered as sub-variants of the more generic term “cloud computing”.  The diagram in this IBM introductory material further elucidates these differences.


Public, Private and Hybrid Clouds

“In general, a public (external) cloud is an environment that exists outside a company’s firewall. It can be a service offered by a third-party vendor. It could also be referred to as a shared or multi-tenanted, virtualized infrastructure managed by means of a self-service portal.

A private (internal) cloud reproduces the delivery models of a public cloud and does so behind a firewall for the exclusive benefit of an organization and its customers. The self-service management interface is still in place while the IT infrastructure resources being collected are internal.

In a hybrid cloud environment, external services are leveraged to extend or supplement an internal cloud.”  (Source: IBM)

Diagrammatically, the three (3) types of cloud computing offerings can be depicted as:

Cloud Computing Types

(Source: Sam Johnston)

Private Clouds

“Private clouds presents (sic) a shift from a model where everything is customized to one of standardization. Management in such an environment is no longer about avoiding change but instead embracing it to facilitate IT’s twin goals: delivering on the needs of the business and managing the underlying resources in the most efficient way possible.

The move to private cloud represents an industrial revolution for IT, applying industrial manufacturing techniques to the provisioning of IT services, gaining standardization and automation.

Standardization is central to achieving much greater operational efficiency.  Private clouds not only facilitate standardization but dramatically increase the returns on standardization. Deploying standard infrastructure from a templatized catalog of applications is orders of magnitude faster and easier than building each application from scratch. Similar gains are available from centralizing and standardizing high availability, network management, and security.

To take advantage of the cloud, there needs to be a clear separation of the production versus consumption layer. In the cloud, the consumer (the business) has no idea – and importantly, little interest in or concern with – what hardware platform and management tools are being used to deliver services.”  (Source: VMWare)

It should be noted that some commentators (for instance, Sam Johnston in his “Random rants about tech stuff (cloud computing, intellectual property, security, etc.)“) suggest that Private Clouds are a neologism used to justify various vendors’ offerings in competition with the “pure” Public Cloud model. Nevertheless, even these commentators acknowledge that Private (and Hybrid) Clouds are likely to be used into the immediate future as organisations come to grips with a new way of providing computing facilities.


What Should Run in the Cloud?

Since the “cloud” can effectively implement any computing environment (operating systems, etc), then basically anything could be run in the cloud.  As with most things in life, just because it is possible does not necessarily make it either desirable or useful (or even usable).

Typically, highly interactive applications may best operate using a desktop or workstation environment (such as high end graphics manipulation, high end development IDEs, etc).  But the boundaries between a pure cloud environment and a pure desktop environment (and, now, even a pure mobile environment) are becoming increasingly blurred.  In many instances, what were previously only desktop applications now are connected to cloud facilities, typically for storage of data, but also for additional processing capabilities (for instance, to render complex images using the additional compute resources available in cloud facilities).  In the same manner, mobile applications will store (synchronise) data using a cloud facility, thereby allowing a single view of one’s data whether using a web based interface (into the cloud), a mobile device interface (ie on a smartphone) or a desktop interface (ie a MS Windows application).

In addition, applications with extremely sensitive security profiles would most likely not be run in a public cloud or hybrid environment (although could readily be conceived as operating in a secure private cloud environment).

Everything else is amenable to cloud based operation.


How big is Cloud Computing?

An interesting infographic from the Wikibon site and its blog provides an insight into the current and projected size of cloud computing, including the economics of why cloud computing is here to stay …

How Big is the World of Cloud Computing?
Via: Wikibon


The Future of Computing (Then) – The Internet of Things

Computing is no longer about a massive server sitting in a special purpose room, with PCs connected to it via a local network.  Nowadays, computing is everywhere and in everything.  Ordinary items (white goods and other consumer items, motor vehicles, etc) all have computing power within them.  Increasingly, this computing power also incorporates network connectivity (typically, of a wireless kind).

The future of living is about things in our environment all talking to each other and to our applications.  Think solar power and electricity generation and consumption – all monitored and controlled using computer enabled “things”.

As an example, consider GreenGoose.  From their website (and other sources):

“Sensors measure actions you take to reach goals that you select. They’re wireless, battery-powered, and all a little different.  The exercise sensor is the size of a credit card and slips into your wallet, purse or backpack. The others are stickers that you just stick on to things like a water-bottle, toothbrush or floss.  You stick these sensors on your bike, thermostat, showerhead “and even your keychain”.  Each one measures a different thing you do, but they all communicate with the same egg-sized base-station.  They communicate with a gateway you plug into your broadband router. Installation takes less than five minutes and you can do it yourself.  GreenGoose lets you set simple lifestyle goals. Track your own progress automatically with sensors.  Earn lifestyle points the more often you do things. Bonus points for consistency.  Share or exchange points with other applications, or partners offering rewards – or even an allowance for kids.  Eventually this type of connection, between sensors and mainstream services like banking, will be commonplace and probably won’t need to rely on gimmicks such as green eggs. But for now, Green Goose seems like a cute, interesting Internet of Things service for green conscious early adopters to try out.”

Basically, GreenGoose is all about connecting various things (typically exercise type things) together into the (computing) network, such that activity is automatically measured and recorded – and then used for other purposes (in this instance, rewarding oneself for the completion of exercise activity).


Email Tone

Re-posted (with some additional comments) from, with kind regards.

[According to Daniel] Goleman, author of Social Intelligence and godfather of the field of Emotional Intelligence, … there’s a negativity bias to email – at the neural level. In other words, if an email’s content is neutral, we assume the tone is negative.  In face-to-face conversation, the subject matter and its emotional content is enhanced by tone of voice, facial expressions, and nonverbal cues.  Not so with digital communication.  Technology creates a vacuum that we humans fill with negative emotions by default, and digital emotions can escalate quickly (see: flame wars).  The barrage of email can certainly fan the flames.  In an effort to be productive and succinct, our communication may be perceived as clipped, sarcastic, or rude.  Imagine the repercussions for creative collaboration.

Tools are already emerging to address this phenomenon.  See ToneCheck, a “tone spellcheck” app that scans emails for negativity and then helpfully suggests tweaks to make your communication more positive (featured in The New York Times Magazine’s annual Year in Ideas issue).

[The following are some] simple ways to encourage positive digital communication … :

1. Heed the negativity bias. In this case, awareness and attention goes a long way. Consider how your communication may be perceived. Can you be more explanatory? Is your language positive as opposed to neutral?

2. Pay attention to your grammar. [When writing emails in haste (and sometimes not)], meaning is often obscured by simple grammatical confusion. “That’s not what I meant” is emblematic of digital miscommunication, and can escalate a problem quickly. Re-read your emails before sending, and make sure your intended message is being conveyed clearly.

3. Consider emoticons. Until keyboards can actually perceive the emotional content of our digital messages (not so far off!), emoticons may be the simplest method of clarifying tone. … let go of [the] … perception that emoticons are silly. They may currently be our best tool for elevating the emotional clarity of digital messages.

4. Use phrasing that suggests optionality. Email is not a great medium for delivering criticism, but sometimes it’s unavoidable. If you want your message to be well-received, try using phrasing that empowers the receiver. Questions in particular tend to be better received than declaratives (which can seem accusatory). If you’ve noticed a team member overlooked a task, you might email them: “Are you planning to take care of that issue?” Rather than making them feel put upon, you give them agency. [Mind you, a severe questioning tone can be even more detrimental than a direct statement of fact (as long as that statement is phrased “nicely”).  The message here is to think very carefully as to the appropriate phrasing for the situation at hand.]

5. Start things off on the right foot. When the news is mixed, consider leading off your message with an expression of appreciation. Then follow with the meat of your response. It could be something as simple as, “We’re off to a great start, I just have a few small tweaks I want to suggest.” Such gestures may seem like fluff, but they set the tone. Effectively saying “I appreciate the work you’ve already done…” prior to bringing the feedback that means “back to the drawing board!” [And then follow up the “meat” of the email with another statement of praise or appreciation.  If possible, different from the one which started the email.  Make the praise and appreciation sincere – not a “cardboard facsimile” of emotion (that will just inflame the situation).].

6. Jettison email… maybe. Ask yourself, “Is email the best carrier of this message?” Often a more social communication tool such as an internal project management space or messaging tool (Yammer, Action Method, or Mavenlink) can be more appropriate and serve as an emotional buffer. Reactive communication tends to be more measured in a public digital space. Plus an added bonus: knowledge sharing. [Except, be very very careful about posting any direct one-on-one and personal feedback and communication on social sites.  Social sites, and especially short message sites, suffer from the same problems as email, sometimes even more so – because they will typically be read by many more people, and probably by people who do not have the same context surrounding the situation as the two individuals involved in the email.  There are many situations where what appeared to be a simple communication on a shared social site was badly misinterpreted and caused “all-out warfare” on a project.  Finally, don’t forget that digital communication is not the only means of communicating.  The telephone still works.  And so do face-to-face meetings (although, it is admitted, with today’s global business, face-to-face meetings may be rather too expensive or not even possible).  Make sure you keep the NLP Presupposition always in mind: “We are always communicating, in all channels”.  Think of the means of communication that would be best for the purpose before attempting the communication.]

Because of the lack of emotional tone in emails, we often have to go the extra mile to convey a solicitous attitude  – whether it’s rewriting a sentence, adding an emoticon, or offsetting bad news with a positive remark.  Even if it seems a chore, it’s time well spent.

In the immortal words of a recent 99% commenter: Don’t treat others like a “DO IT” button, treat them like human beings.


IBM Watson and Jeopardy

IBM’s Thomas J Watson Research Center in Yorktown Heights, NY has recently completed a new grand challenge – to program a computer to play the quiz game “Jeopardy”.

I have been following this (as have many many other people) – and it has been absolutely fascinating.
This link (to a YouTube video of the final session) should also give you links to the previous sessions over the 3 days. The link to details of Watson will no doubt also give you the relevant video links, and much more.  More information can also be read at Mashable in an article on Watson and interview with Stephen Baker.

Basically, IBM has developed a natural language processing and deep analytic question-and-answer system, using massively parallel processing and huge amounts of memory (as stated here: 2,880 processor cores in 90 Power 750 computers and 15 terabytes of RAM) to implement a system which can answer almost any sort of general knowledge question (asked in a variety of ways, including through association, analogy, puns, etc) – and to answer so many correctly that Watson totally beat the best human players.

The results were fascinating.

At the end of the first day, Ken Jennings was on $4,800 and Brad Rutter was on $10,400, but Watson was on a massive $35,734 (I also answered the questions as they appeared on the screen and achieved $22,400 – although one cannot completely equate the results, since I, watching on a computer screen, did not face the pressure of having to press the button first when the light came on and then answer).

At the end of the second and final day, Brad scored $5,600 before final jeopardy and wagered the lot to reach $11,200, giving him a total of $21,600 over the 2 days.

Ken did much better on the second day, managing a pre-final jeopardy score of $18,200 but only wagered $1,000 to finish with $19,200, to total $24,000 for the 2 days.

But Watson. Well, he (since we can really be anthropomorphic here) scored $23,440 before final jeopardy, wagered $17,973 to make his daily score $41,413 and a massive total of $77,147 for the 2 days.

(By the way, I managed $14,000 for the second day, wagered the lot and got the final jeopardy answer correct (it was Bram Stoker) to finish with $28,000 on the day and $50,400 over the 2 days).

The prize money of $1,000,000 awarded to Watson was donated by IBM to World Vision and to the World Community Grid, while half of the second prize of $300,000 (to Ken) and of the third prize of $200,000 (to Brad) was donated to other charities.

Two important take-aways from this brilliant piece of research.

Firstly, this technology from IBM has so many uses – not just in the medical field (where the first offerings appear to be) but also in the energy and resources fields, the urban planning fields, and certainly in the legal and justice fields. The ability to ingest natural language materials (such as legislation, case law, briefs, submissions, depositions, statements, judgments and miscellaneous other materials) and then to answer complicated questions concerning that material (and link to associated material not previously related to the matter) will be extremely important in the future.

Secondly, IBM Watson was truly amazing. Certainly a breakthrough in technology. But the human beings standing there, who did pretty well against the massive machine, were still, themselves, rather incredible. Humans, in essence, are still mighty powerful. The Jeopardy show had to be filmed on a special set built in the IBM Research Facility, because the computer system comprising Watson took up a whole room and was too massive to move. Whereas Ken and Brad simply walked into wherever they were needed and did their thing. Mind you, computer systems in the 1960s and 1970s took whole rooms – and their capability would now be eclipsed by an iPad or small notebook computer. Twenty years from now, Watson will definitely be in the palm of one’s hand (in one form or another).


Python under Windows Correct Registry Entries

Even if one installs Python using the .MSI installers, it appears that the correct Windows Registry keys are not set.  See, for instance,–Failed-installation.-setuptools-0.6c11.win32-py2.6.exe-td26716013.html for further information.  Joakim Löw wrote a Python program to update the Windows Registry with the correct values as needed, except that it used the old key, now corrected as per the “old nabble” page above.

The attached Python program is a slightly updated version of the original to set the registry keys correctly.
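To give a flavour of what such a program does, here is a minimal sketch (not Joakim Löw’s actual program) of writing the registry key that the .MSI installer should have created.  The version number, install path and function name are illustrative assumptions.

```python
# Sketch: register a Python installation in the Windows Registry so that
# installers (e.g. setuptools .exe installers) can locate it.
import sys

PYTHON_VERSION = "2.6"           # hypothetical version being registered
INSTALL_PATH = r"C:\Python26"    # hypothetical installation directory

# The corrected key (per the "old nabble" discussion) lives under
# Software\Python\PythonCore\<version>\InstallPath
KEY_PATH = r"Software\Python\PythonCore\{}\InstallPath".format(PYTHON_VERSION)

def register_python():
    """Create the InstallPath key with the install directory as its default value."""
    if sys.platform != "win32":
        raise OSError("the winreg module is only available on Windows")
    import winreg
    with winreg.CreateKey(winreg.HKEY_CURRENT_USER, KEY_PATH) as key:
        # Set the key's default (unnamed) value to the installation directory
        winreg.SetValue(key, "", winreg.REG_SZ, INSTALL_PATH)
```

Writing under HKEY_CURRENT_USER avoids needing administrator rights; a system-wide installer would use HKEY_LOCAL_MACHINE instead.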



Moving a WordPress site from one URL to another

If you want to move a WordPress site to a completely new URL, refer to (there are also instructions here: but the former site appears to be better) and perform the following:

  1. First, in the OLD site, do the following:
    1. Login to your site as wp-admin
    2. Go to the Administration > Settings > General panel.
    3. In the box for WordPress address (URI): change the address to the new location of your main WordPress core files (ie enter the NEW address in here)
    4. In the box for Blog address (URI): change the address to the new location, which should match the WordPress address (URI) (ie enter the NEW address in here)
    5. Click Save Settings.
  2. Now, copy the old site to the new site, using something like:
    1. cp -a <old-site> <new-site>
      1. PS: obviously, you have to have a shell into the server directory where the sites are located, or do it via FTP as appropriate
  3. Now, you should be able to use <new-site>
    1. Can optionally delete <old-site> or keep it as a backup
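The copy in step 2 can be exercised with a throwaway directory structure – the /tmp paths below are placeholders for wherever your sites actually live:

```shell
# Stand-in for the real site directories
mkdir -p /tmp/old-site
touch /tmp/old-site/wp-config.php

# -a copies recursively while preserving permissions, ownership and timestamps
cp -a /tmp/old-site /tmp/new-site

# Sanity check: the config file came across
test -f /tmp/new-site/wp-config.php && echo "site copied"
```

Because `cp -a` preserves file modes and times, the new site behaves identically to the old one as far as the web server is concerned.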

The other method, outlined at the top of the page referenced above, does work as well, but it sets the Admin screen in WordPress such that one cannot change the URLs internally (one must edit the wp-config.php file).
Edit the wp-config.php file, which is in the root directory of the WordPress installation, by adding the following two lines (anywhere, but suggested towards the top of the file):
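The two lines themselves did not survive in this copy of the post; presumably they are the standard WordPress URL override constants, sketched here with a placeholder address (substitute the NEW site’s URL):

```php
// Presumed wp-config.php additions - replace the placeholder with the NEW URL
define('WP_HOME', 'http://example.com/new-site');
define('WP_SITEURL', 'http://example.com/new-site');
```

With these constants defined, WordPress ignores (and greys out) the corresponding fields in the Admin settings screen, which is why the URLs can then only be changed by editing this file.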

