Or how do we escape the 'experts' in the echo chamber? Inspired by @jeffjarvis, whose recent post on TEDxNYed: This is bullshit got me thinking about this whole 'expertise' thing again.
Iconoclasts are the people who tear down the idols of faith. Traditionally this has been a religious activity, but the growth of a secular society has seen the development of secular idols of faith. And social computing has already developed many of the trappings of a religion, with its own priesthood and idols.
But one of the big lessons of recent times is that experts don't always have all the answers, and that we can learn a great deal from sharing knowledge for general benefit.
Admittedly, in some cases, only an expert will do. Some examples: if I’m having brain surgery a group of opinionated and gifted amateurs is not who I want on the case; nor do I want my accountant or lawyer to be inexpert.
But in the case of emerging applications for social computing there are not really any experts. There are people who know enough to give a perspective of the technology, the affordances of that technology, and possibilities inherent in it. But once that is out of the way there is a lot more value in shared discourse than in monologue.
I often facilitate sessions with educators where we discuss how social computing is changing the landscape for both teachers and students. And I always come away from those sessions humbled by the amount that I learn. Not because these people know more. Rather, it is because they are inquiring and asking questions. It is in the questions and attempts at solving real-world problems that we uncover new approaches.
Real people sharing experiences, prompting new ideas and connecting the dots drives experimentation and the adoption of new ideas and new ways of doing things in social computing. This is nowhere clearer than in the various coffee mornings (e.g. NSCM) around Sydney, where people sit and talk over coffee. They share ideas and experience, and many come away energised and buzzing with new ideas to try.
But missing from the equation in social computing (or what some people call social media or new media) are the people who are willing to identify the secular sacred cows and call bullshit.
Too many of us are sitting at the feet of the experts (or gurus, ninjas, rockstars, gods and goddesses) waiting for them to deliver the answers from on high (possibly on the new HP tablets if not stone tablets).
Perhaps it’s time for some more social media iconoclasts?
Social media and social networking do not reduce the need for good social skills. Rather, the disconnection from physical presence in online communication makes social skills (what some call EQ) even more critical.
Some of the recent fracases ricocheting across Twitter are good examples of this – covered well by various people including @kimota and @mUmBRELLA.
The basic skills for building relationships include reciprocity, negotiation ability and sharing. Also critical are the skills of walking away gracefully from an issue or staying to fight with dignity.
For many people these are skills that were learned in the playground. But what happens when people have missed these important lessons?
What happens if the person who’s been asked to run your firm’s social media activities never developed those skills in the playground? And what are the essential skills required for effective social interaction?
It seems to me that we've been putting up with a paucity of social skills in the workplace for a long time, and it is only now, when there is traceable evidence, that we've noticed it's a problem. Social media merely provides documentary evidence of the kinds of human social interactions that have been happening for aeons. The problem is that this documentary evidence gives these unfortunate social interactions a much longer lifespan than a cranky comment made in passing conversation.
On a quick-fire medium like Twitter it is easy for a grumpy day or a lack of coffee, combined with quick fingers, to lead to an explosive incident for your brand. Then the Streisand Effect can amplify the incident so that it resonates for days or weeks afterward. And, as an added benefit, the whole thing gets indexed by search engines and remains findable for ages.
Social media is now providing us with tangible evidence of how many people lack (or fail to demonstrate) the basic skills required to get along well in the playground. And these are the same skills we need to work successfully with other grown-ups, both online and offline.
Goleman, one of the gurus of emotional intelligence, offers twelve questions to assess emotional intelligence. If you answer 'yes' to half or more (and others who know you agree with your self-rating), then you are apparently doing okay.
The real question is how can we apply this to social media and learn how to channel the best of ourselves rather than the worst?
Goleman’s 12 Questions
Do you understand both your strengths and weaknesses?
Can you be depended on to take care of every detail? Do you hate to let things slide?
Are you comfortable with change and open to novel ideas?
Are you motivated by the satisfaction of meeting your own standards of excellence?
Can you stay optimistic when things go wrong?
Can you see things from another person’s point of view and sense what matters most to that person?
Do you let customers’ needs determine how you serve them?
Do you enjoy helping co-workers develop their skills?
Can you read office politics accurately?
Are you able to find “win-win” solutions in negotiations and conflicts?
Are you the kind of person other people want on a team? Do you enjoy collaborating with others?
Are you usually persuasive?
[Source: Goleman, Daniel. “Working Smart.” USA Weekend, October 2-4, 1998, pp. 4-5.]
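The scoring rule above is simple enough to sketch in code. This is an illustrative sketch only: the questions (abridged from the list above) and the "half or more" threshold come from Goleman's article, while the function and variable names are my own.

```python
# Illustrative sketch: score Goleman's 12-question self-assessment.
# Questions abridged from the article quoted above; names are invented.

GOLEMAN_QUESTIONS = [
    "Do you understand both your strengths and weaknesses?",
    "Can you be depended on to take care of every detail?",
    "Are you comfortable with change and open to novel ideas?",
    "Are you motivated by meeting your own standards of excellence?",
    "Can you stay optimistic when things go wrong?",
    "Can you see things from another person's point of view?",
    "Do you let customers' needs determine how you serve them?",
    "Do you enjoy helping co-workers develop their skills?",
    "Can you read office politics accurately?",
    "Can you find win-win solutions in negotiations and conflicts?",
    "Are you the kind of person other people want on a team?",
    "Are you usually persuasive?",
]

def doing_okay(answers):
    """answers: one boolean per question, in order.
    Returns True if 'yes' to half or more of the twelve questions."""
    if len(answers) != len(GOLEMAN_QUESTIONS):
        raise ValueError("expected one answer per question")
    return sum(answers) >= len(answers) / 2

# Example: eight 'yes' answers out of twelve clears the threshold.
print(doing_okay([True] * 8 + [False] * 4))  # True
```

Of course, as Goleman notes, the self-rating only counts if others who know you agree with it – that part does not automate.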
I spoke during the last session of the day at Media 140 Perth about the realtime web and how it might evolve into an internet of connected people and things. Our evolution towards a networked and hyperconnected society is under way.
The slides might be somewhat opaque without the commentary but please feel free to ping me with any questions.
Crowdsourcing is very trendy these days and is touted as the answer to many of the ills of poor design and the need to reduce costs. In these cash-strapped days any way to make innovation better-cheaper-faster is extremely desirable.
But crowdsourcing is just one of the many tools we have at our disposal, and each tool is suited to particular kinds of applications. To simply adopt an idea like this without considering its suitability to the problem domain or to the desired results can be risky.
To assist with critical thinking about crowdsourcing I have collected a few alternative viewpoints & listed five reasons why it might not always be the best approach to adopt. Please note I do not agree with everything in the articles linked below – they are meant as a thought-starter & to provide different perspectives on crowdsourcing (i.e. if you've got any issues with the articles please contact the author directly).
Since no single tool is the answer in all cases, here are a few times when crowdsourcing might not be the right solution:
1. When the crowd does not have sufficient understanding or knowledge
For crowdsourcing to work you need to find the right crowd. If the technical or scientific knowledge required is rare then crowdsourcing might not be helpful unless you can find a crowd of people with the requisite foundational knowledge.
2. Where the problem is diffuse and complex
Crowdsourcing lends itself to solving clearly focused problems where there is little ambiguity or nuance – a great recent example of this was the DARPA balloon challenge.
For diffuse and complex problems it might be necessary to chunk up the challenge (if that is possible). And for problems that require the painstaking layering of knowledge and information with long-term focus, it might not be commercially viable.
A good example of this is the 18th-century search for a way to determine longitude at sea, which was crowdsourced via a government prize. It worked in the long run, but it took a really long time and was funded by the government. However, it might be argued that this kind of discovery would be much quicker today with computing power.
3. When you want to keep your plans secret
Clearly secrecy requires that only a few people know the secret. Thus crowdsourcing something that is meant to be a secret is probably a bad idea (unless you are executing a cunning 'hide in plain sight' sort of plan).
4. Your problem needs to be compelling enough for contributors to care
The experience of Wikipedia indicates that people will contribute to things that interest them. Thus if nobody cares about solving your problem, then crowdsourcing might not be the answer.
There is also a well-known Forrester report on Social Technographics that segments people's participation within social networks. It shows that only a small proportion of people create or share content, a somewhat larger group comments or curates, and the bulk of people lurk or do not participate at all.
5. Crowdsourcing for complex problems requires dedicated resources
To undertake the kind of knowledge work required to solve complex problems contributors need uninterrupted time in the zone.
This is exemplified in some of the large open source software projects where companies pay people to work full time on open source projects for commercial advantage:
Many of the leaders of key projects (like Guido van Rossum, the inventor of Python, who works at Google) are paid by their employers to continue to lead their projects. Is there an open source community? Of course there is. But on the most prominent projects, the members of the community have jobs and are paid to work on open source because the software is so beneficial to their employers, even though it is not owned by them. True, there are hybrid models, and the smaller the project, the more likely it is unfunded. But when it becomes a big deal, open source becomes commercial.
Many businesses as well as individuals now see it as normal to have a Twitter account, Facebook page, YouTube channel, a website and/or a blog.
2009 is also the year that MySpace lost out to Facebook – by focusing on eyeballs and advertising rather than ease of user interaction it marginalised itself.
Facebook went from strength to strength by year end picking up 700,000 new users per day; ending the year as the de facto social network for both geeks and non-geeks. Key to the growth and success of these social platforms among the mainstream population is ease of use, ease of connection with others, and ease of sharing. It is much easier for something to go viral when it is easy for ordinary people to share it.
Facebook is a clear winner on each of these criteria, while Twitter has had a slower adoption as the how-to is not as evident to the new user. However, Twitter is winning the day as the home of buzz and breaking news.
Another example of the mainstreaming of social media is the way it is now an integral part of traditional media such as newspapers, radio and television. Most newspaper sites now enable readers to share content on various social networking sites and to comment on the site.
Many television and radio stations supplement their regular content with additional content such as video, podcast and forums. The BBC has asked its viewers to provide video and image content, while other newspaper sites actively solicit reader photographs or videos for use on their sites.
To the chagrin of the traditional media power brokers much of the innovation in social media is coming from the public broadcasters – for example the BBC in the UK and the ABC in Australia.
Each of these has embraced podcasts, time shift video and active involvement on Twitter and other social networks. This has led to some lively debates between traditional media owners like the powerful Murdoch family and the public broadcasters (see Murdoch attack on ‘dominant’ BBC).
And it, in turn, has drawn spirited responses from the public broadcasters – for example, Mark Scott of the ABC in Australia in The Fall of Rome: Media after Empire. One thing is clear from the events of 2009: the landscape and revenue models for traditional media have shifted, and the industry is faced with threats to its very survival.
There have been many and varied responses to the shift in the traditional media landscape: Rupert Murdoch deciding to take his News Corporation content out of Google or put it behind a paywall; myriad local newspapers in the US closing down; and ordinary people not caring much at all as they continue to obtain good-quality information from various online sources. One thing is certain: there has been a huge shift in purchasing patterns for traditional media. Newspaper sales are down, as are free-to-air television audiences, with associated reductions in advertising revenue for proprietors.
There have been some interesting responses to this shift in the traditional media landscape; including the Media140 series of conferences (please note I was live blogging the Sydney event). This is the brainchild of Ande Gregson and his grand plan is to have conferences around the world in 140 days.
Media140 is focused on bringing together practitioners from journalism, politics, advertising, new media and entertainment to consider how the real-time web is changing the way we communicate, socialise and do business. New business models will evolve to take advantage of social media and the real-time web. Their evolution will be driven by the conversations and business ventures that occur during this time of shift in the media industry.
In 2009 the familiar media world we knew from the past century shifted. The age of real-time, social, computer driven news and communications is upon us. It is powered by web 2.0 platforms and funded by emerging business models. Old empires are trembling and new ones are being born. We are in for an interesting time in 2010 as all of these trends continue and we get a glimpse of the winners and losers in this shift.
Of these I will concentrate on social computing and the next generation internet as they are driving a lot of change that is impacting on the education sector.
But probably the biggest change over the past thirty years is the rate of change. Once it was completely acceptable to wait a week for a letter to arrive, to ponder one’s response for a few days and then write and dispatch a letter by post. Then the fax machine changed all of that. Now we receive emails immediately followed by a phone call asking why we have not yet responded.
The pace of change has increased substantially over the past 30 years. Look at the mobile phone as an example. From the time the telephone was invented until the mid-1980s it remained recognisably the same device. Now, to a person who last saw a telephone in 1980, the iPhone or smartphone would not even seem to be in the same family of devices. And, indeed, it is not. The modern mobile phone is really a converged computing, telecommunications and entertainment device. It even has more memory than my first server.
The next thing to consider is the revolution of the internet. Originally conceived as a bulwark against nuclear war and as a way for academic researchers to communicate it has reshaped the world. Now many people use the internet every day as an integral part of their lives – for sending email, chatting online, shopping, entertainment and business.
Along with this growth in the pragmatic use of the internet, social networks are also becoming mainstream, with Pew Research from 2009 showing that 46% of US adults have used a social network at least once, and 27% had used one 'yesterday' (that is, on the day before they were surveyed).
This area of social computing has been the real area of growth and the data clearly shows how social computing is changing how ordinary people share, communicate and interact.
Some examples of these changes include:
In the past email and search engine internet traffic exceeded that of social networks. However, in December 2009 search traffic and social network traffic approached parity in Australia for the first time.
Earlier, in late 2007, social network traffic had surpassed that from email in the UK for the first time.
And adult website traffic was also overtaken by social networking traffic for the first time in late 2008.
The important thing to note here is that the behaviours of searching, sending emails or checking out p~rn did not change. What changed is the location in which it happens. Thus if you are in Facebook and so are all of your friends it simply does not make sense to leave the application to use another email client.
There has also been development of niche networks for different interest groups. For business there are LinkedIn and Plaxo (amongst many others) and Facebook is winning the war as the de facto social network for everyone else.
Another interesting characteristic of this landscape is that ordinary people are creating and participating online in ways that were once unthinkable. Without specialised technical assistance people are creating videos to share on YouTube or Viddler; they are creating blogs on WordPress, Blogger or Typepad; they are sharing photos on Facebook or Flickr. Remixing music or visual materials is rife – questions of provenance and copyright remain unanswered. Video downloads, online shopping, banking and travel arrangements are becoming the norm.
Against this backdrop various researchers have mapped the generations:
GI Generation aged 73+
Silent Generation aged 64-72
Older Boomers aged 55-63
Younger Boomers aged 45-54
Gen X aged 33-44
Gen Y aged 18-32
And, while the notion of dividing up the population on the basis of age cohorts is useful for analytical purposes, it ignores some simple facts about people. Within each age cohort there is a bell curve of change adoption – some members are early adopters, the mass are the early & late majority, and they are followed by the laggards. I fundamentally disagree with the idea that mere membership of an age cohort determines a person's relationship to technology or propensity to adopt change. Rather, the determining factor will be one's willingness to be connected.
This willingness and desire to be hyperconnected via technology will become the new generation gap. A great example of this is the loose confederation of people who meetup on Thursday mornings on the northside of Sydney for coffee. Most of them met originally on Twitter, decided that they liked each other and thought it would be good to catch up informally for coffee.
What has happened is that this has created a vibrant group of people who now know each other in real life. Business ideas are exchanged, family and social tips are shared, and other connections are made and broadened. More can be seen at their Posterous site at www.nscm.posterous.com. There are now many similar groups all around Australia – I have attended them in Perth and Brisbane.
What is interesting here is that online and offline activities are blurring and the boundaries between public and private are no longer clear. The conflicts between the connected and the unconnected are already being seen in schools, colleges and workplaces around the world. Just try asking members of your class to turn off their mobile phones to test this hypothesis.
The social implications for all of this are astounding. They reverberate across all areas of life from business to education to socialising.
This technology and the way it is being used now is creating massive interconnections between people and enabling the creation of groups and communities. This kind of community building and collaboration is similar to what we experienced when living in smaller villages rather than in large cities.
But think on this – the children of today will stay in loose contact with every group of people they meet throughout their lives, from kindergarten onwards. That is going to be a challenge to manage over a lifetime. The only way to manage these masses of loose connections is by chunking them up into niches. This is where richer technologies come in: ones that enable this chunking to happen seamlessly, based on use, rather than manually, based on effort.
Another feature of this interconnected world we live in is that we no longer need to wait. Delayed gratification is becoming a thing of the past in many respects. For example in the area of entertainment we used to wait for a movie to come out or wait until our favourite television show was broadcast. But now with the advent of decent broadband and streaming video there is no more waiting. Anyone can watch what they want when they want. And they do exactly that, as anyone with teenagers in the house with a broadband connection knows all too well.
However, against the backdrop of this explosion of connections, information and entertainment at our fingertips, we remain unreconstructed human beings, much as we were in our cave-dwelling days.
We still retain our tribal brains that work best in small groups the size of a basketball team. Our brains are wired to deal with small chunks of information – like the magic number seven, which is the number of items we can retain in our short-term memory.
Also we are constrained in our ability to handle a great many close relationships. Many cite Dunbar’s number which is the supposed cognitive limit to the number of individuals with whom any one person can maintain stable social relationships: the kind of relationships that go with knowing who each person is and how each person relates socially to every other person.
Imagine how many contacts you would have if everyone you had ever met since kindergarten was a friend on Facebook. This is precisely what is happening to our young people today.
This means that we need to chunk up all of those massive networks we collect so as to manage them over time. It also means that we are maintaining increasingly loose ties with larger numbers of people.
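Chunking a mass of loose connections into niches is, at bottom, a grouping operation. A minimal sketch of the idea, where the contacts and the niche labels are invented purely for illustration:

```python
from collections import defaultdict

# Illustrative sketch: chunk a large, loosely-tied network into niches
# by grouping contacts by the context in which we met them.
# The names and contexts below are invented for illustration.
contacts = [
    ("Alice", "kindergarten"),
    ("Bob", "university"),
    ("Carol", "kindergarten"),
    ("Dave", "twitter-coffee-morning"),
    ("Eve", "university"),
]

def chunk_into_niches(contacts):
    """Group (name, context) pairs into niches keyed by context."""
    niches = defaultdict(list)
    for name, context in contacts:
        niches[context].append(name)
    return dict(niches)

niches = chunk_into_niches(contacts)
print(niches["kindergarten"])  # ['Alice', 'Carol']
```

The point of the richer technologies mentioned above is that this grouping would happen seamlessly from patterns of use, rather than requiring anyone to tag contacts by hand.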
Ultimately we are social creatures and want to create social networks either on or offline. I often use the example of Facebook, where ordinary people of all ages are routinely creating affiliation groups. These online groups are even creating real life relationships – for example the Twitter community in Sydney often meets up physically with most of us having met online originally.
Another element to the mix is the amount of information we are required to process every day – email, news, social networks, entertainment, etc. We can no longer store all of this information in our heads.
This is not merely a gratuitous picture of Brad Pitt. It harks back to a time when our societies used epic poetry to store and transmit important information; now it is all in nearline or online storage. For example, many of us no longer recall the phone numbers of our nearest and dearest, since they are stored so handily in our mobile phones.
How we are going to retrieve a lot of that information in the future is also an open question. I've got a floppy disk at home with some interesting photos of a data centre I built a few years ago, but I no longer have any technology to access that information.
So where does all of this put us as educators? There are some who talk of a nirvana where all students are self directed learners and we are coaches and facilitators. But I suspect that those people have not met some of my students.
Let’s look back to the web 2.0 meme map from O’Reilly’s Foo Camp a few years ago. It clearly talks about all of the things that have become part of social computing (and this includes social media and social networking).
The following is an overview of the Web 2.0 Design Patterns which will impact upon the future (in general and for education):
The Long Tail: Small sites make up the bulk of the internet's content; narrow niches make up the bulk of the internet's possible applications. Therefore: Leverage customer self-service and algorithmic data management to reach out to the entire web, to the edges and not just the center, to the long tail and not just the head.
Data is the Next Intel Inside: Applications are increasingly data-driven. Therefore: For competitive advantage, seek to own a unique, hard-to-recreate source of data.
Users Add Value: The key to competitive advantage in internet applications is the extent to which users add their own data to that which you provide. Therefore: Don't restrict your "architecture of participation" to software development. Involve your users both implicitly and explicitly in adding value to your application.
Network Effects by Default: Only a small percentage of users will go to the trouble of adding value to your application. Therefore: Set inclusive defaults for aggregating user data as a side-effect of their use of the application.
Some Rights Reserved: Intellectual property protection limits re-use and prevents experimentation. Therefore: When benefits come from collective adoption, not private restriction, make sure that barriers to adoption are low. Follow existing standards, and use licenses with as few restrictions as possible. Design for "hackability" and "remixability."
The Perpetual Beta: When devices and programs are connected to the internet, applications are no longer software artifacts, they are ongoing services. Therefore: Don't package up new features into monolithic releases, but instead add them on a regular basis as part of the normal user experience. Engage your users as real-time testers, and instrument the service so that you know how people use the new features.
Cooperate, Don't Control: Web 2.0 applications are built of a network of cooperating data services. Therefore: Offer web services interfaces and content syndication, and re-use the data services of others. Support lightweight programming models that allow for loosely-coupled systems.
Software Above the Level of a Single Device: The PC is no longer the only access device for internet applications, and applications that are limited to a single device are less valuable than those that are connected. Therefore: Design your application from the get-go to integrate services across handheld devices, PCs, and internet servers.
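The "perpetual beta" pattern can be made concrete with a small sketch: features are rolled out behind flags and their use is instrumented, rather than being bundled into monolithic releases. All the names here (the flags, the counter, `use_feature`) are invented for illustration, not any particular product's API.

```python
from collections import Counter

# Illustrative sketch of the "perpetual beta" pattern: ship features
# continuously behind flags, and instrument how often each is used.
# All names are invented for illustration.

feature_flags = {"new_editor": True, "live_comments": False}
usage = Counter()  # instrumentation: count each feature's uses

def use_feature(name):
    """Run a feature only if its flag is on, recording each use."""
    if feature_flags.get(name, False):
        usage[name] += 1
        return f"{name} ran"
    return f"{name} is not enabled yet"

use_feature("new_editor")
use_feature("new_editor")
use_feature("live_comments")
print(usage["new_editor"])     # 2
print(usage["live_comments"])  # 0
```

The usage counts are exactly the kind of real-time telemetry O'Reilly describes: they tell you how people actually use a new feature before you decide whether to turn it on for everyone.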
The social web has developed a set of values based on that original web 2.0 meme map and this Wordle map shows some of those enacted in social computing at present.
But teaching has its own longstanding set of values. And today we are seeing a conflict between those two sets of values in classrooms and lecture halls around the world.
But first, a few comments on the nature of these new tools. They are a great enabler for minority groups, levelling the playing field in many ways. However, it is well to note, as Grady Booch once said: "a fool with a tool is still a fool".
Our learning institutions are sometimes slow to change and adapt to new ways. On the other hand, teachers are often the ones in the vanguard, embracing change and pushing the boundaries. The institutions of learning in this country are pretty conservative and slow to adopt newfangled technology, usually quite sensibly on the basis of cost. But now, with web 2.0, social computing and open source, the main arguments against new technology adoption are being destroyed.
Individual teachers are embracing change, but sometimes when I meet these visionary folks they seem more like revolutionary cells of the vanguard than part of the institutional mainstream. But the learners will eventually force our hands by disengaging if we do not respond to the shifts in their cultural practices.
This leads into another area of contention: boundaries. These new tools are creating disputes about when and where it is appropriate to use the technology (for example, have you ever tried to get a Gen Y class to turn off their mobile phones?). There are also questions about the content and authority of information created or shared – think of the endless discussions about plagiarism and the appropriateness of Wikipedia as a research authority.
We are dealing with a radically different set of expectations – from our staff, administrators and students (or consumers). Many of these people were socialised in the old non-digital world; while others are digital natives.
As part of my preparation for this session I’ve been trying to distil my thoughts on the implications of new technology on culture and learning. And for me it has all come down to sensemaking as the purpose of education. Dan Russell provides a nice definition of sensemaking: “Sensemaking is in many ways a search for the right organization or the right way to represent what you know about a topic. It’s data collection, analysis, organization and performing the task.”
To a certain extent I think that these changes mean we need to become co-participants in the learning experience. Become facilitators of the process rather than the experts. This does not mean that our experience or empirical knowledge is not valuable. We need to establish mutual respect and open dialogue. And luckily now we have the technological tools to facilitate that dialogue.
It is going to be an interesting balancing act between those different sets of expectations. Defining boundaries in a hyperconnected world is a challenge, but it is worth remembering that interesting discoveries are made at the boundaries of the currently known world. Some of the tools to help with this sense-making process are to embrace the values of web 2.0 as part of classroom practice.
But the challenges to the authority of the teacher and of the institution are not only coming from students and society in general. They are also coming from competitors.
By this I mean the institutions that are subverting traditional ideas of the university or college and putting their intellectual property out online for free. The institutions doing this include the august (e.g. Stanford, MIT) as well as the ambitious (e.g. USQ) as Lifehacker so kindly lists.
Other challenges are coming from the radical transparency that the web enables. Here I'm thinking of things like Rate My Teacher and Rate My Professor. There is no more hiding from bad appraisals by students; it's all out in the open now. But looking on the bright side, it's happening to kittens as well.
All of this brings us tremendous opportunities as both a society and as educators. It seems like we’re not in control any more. But I do question if the control we once had was merely an illusion. And I wonder if this new world might not be a healthier one for all of us?
The biggest shift is that we are dealing with connected individuals who are at the centre of a web of networks enabled and mediated by technology. This will give rise to power shifts that we will need to live through and embrace in order to survive.
Note: all data mentioned above is detailed in my slides here
What Is Web 2.0.: Design Patterns and Business Models for the Next Generation of Software, by Tim O’Reilly, 09/30/2005
I really enjoyed the opportunity to present to the TAFE teachers of the Western Sydney Institute recently about social computing and its implications for education. Slides follow and more detailed notes will be posted shortly.
I had the pleasure of running into Jim Shomos the other night & he was telling me about his latest project – Mordy Koots.
This project is amazing in the way that it brings together so many of the threads of film, gaming, web and social computing. Lots of the ideas that people have discussed, such as the shifting consumption patterns for new media, are realised in this project.
Mordy Koots takes a different approach to telling a story. There are ten action-packed three-minute episodes delivered via web and mobile in partnership with NineMSN. It stars the very funny & endearing Shane Jacobson (of Kenny fame) and is directed by Clayton Jacobson.
This has not been launched yet, but Jim kindly gave me permission to use the clip. Check it out.
I suspect that this is a glimpse into the future of entertainment led by some Aussie ingenuity and the constraints of making feature films in smaller markets.
I always like to keep up with what Dion Hinchcliffe is thinking, and recently he’s been talking about How the Web OS has begun to reshape IT and business, and particularly about how businesses are driving the change almost by accident, in spite of the IT department.
These days in the halls of IT departments around the world there is a growing realization that the next wave of outsourcing, things like cloud computing and crowdsourcing, are going to require responses that will forever change the trajectory of their current relationship with the business, or finally cause them to be relegated as a primarily administrative, keep-the-lights-on function.
What Dion describes really aligns with what I’m seeing in lots of companies and their IT departments. For many IT departments there seems to be a feeling of “if we just ignore it, ban it, or block it then it will all go away”.
The issue of what I tend to refer to as the shadow IT department is beginning to loom large. This shadow department offers many of the IT department’s capabilities, but they are accessible by ordinary business users outside of the normal IT and procurement channels.
Once upon a time the IT department were the custodians of technology. Selection, implementation of new systems and access to them was like joining a mystery cult. New users were indoctrinated into special language and special ways of making things work. The IT department staff were the high priests of the cult and they controlled access very strictly.
All this was reinforced by the high cost and complexity of IT systems.
But now technology has undergone a revolution. And it is a revolution akin to the one the Russians lived through in 1917. We are living through a sudden change in the accessibility of technology. With web 2.0 and social computing, ordinary users now have access to the same kind of technology that was once the province of the high priests of the IT department.
Everything you need is at your fingertips, for example:
Each of those examples is readily available to the average person who can use a web browser & who has a credit card. No more seeking the advice of the IT specialist (even if it might help). Just notice the need and get a solution right away.
Spoke at the Bunbury ACS Chapter the other night (g’day to @Moist & @nezzle) – about the future of computing and the impact of social computing. We had a really interesting discussion about privacy (and the death thereof as I have been prone to argue) and the possible futures arising from the social computing revolution.
My friend, Mark Pesce, has written extensively on the impact that the kind of hyperconnectivity enabled by the internet and social computing will have on education, business and politics. He covers many similar ideas to mine. And it would be foolish of me to do otherwise than direct people to his most excellent thoughts on the topic. Check out some of Mark’s ideas on his blog, or in particular here and there.
But one topic came up last night that is both interesting and important. One participant suggested that perhaps her children, and coming generations more broadly, would understand all of this new technology and its implications much better than we do.
It was a great thought starter for me. Because I’ve argued for a long time that what we are doing with web 2.0 and social computing is abstracting away from end users the complexity inherent in technology.
Until now, anyone who wanted to create a software artifact – a web page, or uploaded content such as images, video or audio – needed to acquire a reasonable amount of technical knowledge.
To create a web page one needed to know basic HTML. To upload the web page and associated content to a URL one needed to know how to use FTP, either via the command line or a client.
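To make that concrete, here is a minimal sketch of the old workflow: hand-write the HTML, then push it to a server over FTP. The host name, credentials and filename are placeholder assumptions, not a real service.

```python
# A sketch of the pre-Web-2.0 publishing workflow: write HTML by hand,
# then upload it with FTP. Host and credentials below are placeholders.
import io
from ftplib import FTP

# Step 1: the page itself had to be authored in raw HTML.
page = """<!DOCTYPE html>
<html>
  <head><title>My Page</title></head>
  <body><h1>Hello, web</h1></body>
</html>
"""

def upload(host, user, password, filename, content):
    """Push a text file to an FTP server (requires a real server)."""
    with FTP(host) as ftp:
        ftp.login(user, password)
        ftp.storbinary(f"STOR {filename}", io.BytesIO(content.encode("utf-8")))

# Step 2: upload it - commented out because ftp.example.com is hypothetical.
# upload("ftp.example.com", "me", "secret", "index.html", page)
```

Every step here assumed knowledge that ordinary users didn’t have: markup syntax, server addresses, login credentials, and the FTP protocol itself.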
Now one can simply sign up to Facebook or MySpace and, without any technical knowledge beyond use of a keyboard, mouse and web browser, upload and share text, video, audio and images.
Web 2.0 and social computing have democratised the use of technology so successfully that it is now a utility akin to a light switch.
Most of us have no idea what goes on behind the light switches that we use every day. The web and its applications are becoming similar utilities.
An interesting question follows from this: if the web becomes a utility, will we stop thinking about it very much?
Will we stop considering social, cultural and political issues that surround it and merely accept it in the same way we take electric light for granted?
By abstracting the complexity inherent in web applications and content away from end users are we making it easy for the technology to be used to constrain our behaviour, beliefs and actions?
And, most importantly of all, who is going to create the future applications if everyone just accepts the technology as a utility? We’ve already got a similar problem in the West with electrical engineers. How are we going to keep up the supply of people who know how to create software, to ensure that we don’t end up as a world of “middlemen” who only know how to use, but not create, technology?
Some very interesting questions raised by the people I met in Bunbury.