Friday, 20 November 2009

Open knowledge and the role of public funding

I read with interest that Rupert Murdoch intends to limit Google's access to his news pages. It shows that Murdoch has no concept of the need for freely moving knowledge to increase human productivity, but wishes instead to build a more successful business model for his company. He can do as he wishes, however (no matter how many may disapprove), as the content of his news pages is fully in his control. But what if knowledge is created with public funds: should we charge for it? I argue that any research funded by the public purse should be free of intellectual property rights (IPR), as it is a public good paid for by the public. This is called open source knowledge, and it has a key benefit: anyone can apply it free of charge, widening the impact of the generated knowledge. Economists have argued for many years that barriers in marketplaces cause imbalances that are detrimental to ideal consumption patterns; usually the imbalance will create unfair pricing. Surely if knowledge is produced within a university environment, paid for by taxpayers, it must be made freely available to those who want it?

I worry about the growing pressure on universities to encourage IPR in academic research. Universities are here to expand the body of knowledge and encourage its use in the wider community via a number of channels. Restrict its use via stringent IPR policies and it will be used less. This will undoubtedly reduce the impact academics hope for in their research and stifle further innovation. It is fair that private organisations that generate knowledge or processes should have their IPR protected. Protection of IPR in this case increases innovation, as private investors are made confident that their investment is protected under law. However, by encouraging universities to be guided into profit-making IPR research, the fundamental way that knowledge is generated changes.
Academics, for the most part, are not financially orientated but do what they do to maximise the benefit to society. Add a profit motive to this activity and the role becomes one of generating profit on your research, and suddenly the knowledge growth model falters. Why am I writing this in a technology blog? The Lancaster Centre for e-Science produces knowledge in the form of software and papers about technology. Every item we produce is governed by open source attribution licences. This means that everything we produce is given to the world for free use. This widens our societal impact to the maximum possible by taking away legal and economic cost barriers. This approach does not mean research spin-off companies cannot be formed successfully. The academic skill base is huge, and it is one thing to create knowledge and give it away, quite another to develop it into a business model. It takes skill to do this, and that skill needs to be paid for.

Tuesday, 17 November 2009

Sakai 3 is media and embedded tools

As we become accustomed to the new ways of social media, software must evolve to capture its vibrancy. Students, staff and clients expect this of software providers and the organisations that operate it. Sakai e-research and e-learning software is presently at version 2. Next year comes Sakai 3, which heralds a new way of thinking about how we attach ourselves as individuals to secure online environments. Existing tool sets (Moodle, Sakai 2) compartmentalise their tools. Social media, such as Facebook, have shown the value of embedding tools into styled web pages, although their applications are limited and information security is weak.

Sakai 3 already manages site security very well (as did its predecessor, version 2) and implementing new social media ways of working is going to be a big hit. It will change the way we manage online relationships with our colleagues, students and clients for sure.

In Sakai 2, each tool sits separately within the worksite you use. So, as a user you may want to encourage your research or teaching group to vote on an important matter of the day, either by sending an Announcement or engaging in communication via Forums. To complete the vote, however, the user must navigate away from the communication point (e.g. the forum) to select the vote tool and then cast the vote. Sakai 3 overcomes this limitation and allows users to embed different tools into, say, forums, much like a photo in a text document. No technical skill (e.g. HTML) will be needed; operation will be drag and drop. What this means is that fewer clicks are required to navigate around the site, simplifying the user engagement process. This new way of working also opens the door to better knowledge management practice. When a vote is cast in Sakai 3, the result will also appear in the forum itself, meaning that knowledge on a complete array of activities will be available in one place and be searchable.
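
The idea of a tool living inside a post, with its output indexed alongside the discussion, can be sketched as a toy data model. To be clear, the class and method names below are purely illustrative and are not the real Sakai 3 API; this is just the shape of the idea.

```python
# Toy sketch: a poll widget embedded in a forum post, so its current
# result renders inline (like a photo in a document) and is searchable
# together with the post text. Names are illustrative, not Sakai's.
class PollWidget:
    def __init__(self, question, options):
        self.question = question
        self.votes = {opt: 0 for opt in options}

    def vote(self, option):
        self.votes[option] += 1

    def render(self):
        tally = ", ".join(f"{o}: {n}" for o, n in self.votes.items())
        return f"[poll] {self.question} ({tally})"

class ForumPost:
    def __init__(self, text, embeds=()):
        self.text = text
        self.embeds = list(embeds)

    def render(self):
        # Embedded widgets render inline with the post, so a site-wide
        # search over rendered posts also finds the poll and its result.
        return " ".join([self.text] + [w.render() for w in self.embeds])

poll = PollWidget("Approve the proposal?", ["yes", "no"])
poll.vote("yes")
post = ForumPost("Please vote below.", [poll])
searchable = post.render()
```

The point of the sketch is simply that the vote result becomes part of the post's searchable content, rather than living in a separate tool the user must navigate to.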

Over the coming months the Lancaster Centre for e-Science will be installing Sakai 3 as a demonstrator for all existing and new Sakai users to take a look at. In the meantime, if you are interested in learning more about Sakai 3, go to

Sunday, 18 October 2009

US Universities making software for themselves

I mentioned in earlier blogs the ability of universities to manage their own software environments, from administration of services to full software development. This may seem like pie in the sky to most commentators, but the reality is quite different. We are aware of open source content management systems and learning environments (e.g. Joomla, Sakai); but what about software that assists on the operational side of things? Check out this web site for a project with solid Ivy League backing: big university names working together to build university software solutions that are open source and therefore adaptable to individual universities. This is an excellent concept and something universities around the world should benefit from in the future.

Lancaster Centre for e-Science now on Twitter

Follow the Lancaster Centre for e-Science on Twitter. You can find us at

Wednesday, 7 October 2009

The Lancaster Centre for e-Science is working with the Northwest Regional Development Agency LEAD project, which aims to provide leadership training to owners of small businesses in the region. The LEAD programme has been very successful and was originally conceived by the Institute of Entrepreneurship and Enterprise Development (IEED) at Lancaster University Management School to provide owners with an opportunity to meet other business owners in a mix of seminar and online interactions. Research on past participants indicates that those who take part are likely to increase sales turnover by approximately 15%; participants also highlight that this additional growth is significantly influenced by what they learn during LEAD. Over the coming three years the programme, delivered across 15 institutions in the region, will reach around 1,250 business owners, hopefully yielding sizeable increases in North West economic growth.

The question to pose at this stage is how we can keep busy business people online once their LEAD programmes are complete. It would be incredibly useful to have a large, easily accessible group of business people. For instance, at the beginning of the recession, information on its impact at the small business level was scarce. Typically, knowledge of impacts has to wait until the dust settles and aggregate statistics come out of the Office for National Statistics. If we engage with business people electronically, in a trust-based environment, it could provide us with real-time information on events. The Sakai portal technology we produce and provide as a service has capabilities as an e-community building device. It contains the usual tools, probably the most popular of which are forums. e-Facilitators (people who maintain electronic communication) are able to develop trust within their LEAD communities, allowing them to ask delegates for fairly confidential information. Responses to trusted e-facilitators are usually rapid, come from multiple sources, and form very useful qualitative insights into problems. In addition to forums, delegates are highly likely to respond to online surveys posted via the portal. For instance, we gather survey data from delegates as part of the evaluation of the programme, and we have no problem achieving 100% response rates, which is incredibly high. The evidence does suggest that a high value should be placed on developing and maintaining groups of business owners online.

Collaborative Research in Business

Research has begun that evaluates the knowledge exchange interface between the public sector and enterprise. The project, funded by JISC, evaluates how different types of staff at universities communicate with business people as part of their jobs. The final ambition of the project is to look at how web-based technologies can be used to improve two-way knowledge exchange between these groups. Based on the Sakai portal framework, we will embed semantic search tools that will allow people to search for documentation and other people within a secure cloud environment. Importantly, once people have found what they are looking for, it will provide them with the ability to link up using secure worksites, allowing easy access to communication tools and information.
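
As a rough illustration of the search side of this, the classic TF-IDF weighting shows how documents can be ranked against a query. The documents and function names below are invented for illustration; the project's actual semantic search tools will of course be far richer than keyword weighting.

```python
# Toy sketch: rank documents against a query with TF-IDF weighting.
# Terms that are common across all documents count for little; terms
# concentrated in one document count for a lot.
import math
from collections import Counter

def tfidf_scores(query, documents):
    """Return one relevance score per document for the given query."""
    tokenised = [doc.lower().split() for doc in documents]
    n_docs = len(tokenised)
    # Document frequency: in how many documents each term appears.
    df = Counter()
    for terms in tokenised:
        df.update(set(terms))
    scores = []
    for terms in tokenised:
        tf = Counter(terms)
        score = 0.0
        for term in query.lower().split():
            if term in tf:
                idf = math.log(n_docs / df[term])
                score += (tf[term] / len(terms)) * idf
        scores.append(score)
    return scores

docs = [
    "consultancy advice for regional business growth",
    "student projects in software engineering",
    "business research collaboration with enterprise",
]
scores = tfidf_scores("business enterprise", docs)
best = max(range(len(docs)), key=lambda i: scores[i])  # index of top match
```

Here the third document wins because it contains both query terms, and "enterprise" appears nowhere else in the collection.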

To begin the process we are currently surveying all staff at Lancaster University who communicate with business in any way (e.g. consultancy, advice, research, teaching, student projects). Want to participate in the survey, or want to learn more about the research? Click on the following link or cut and paste it into your browser to take part.

Thursday, 7 May 2009

When innovation stalls.....

Software evolves at an alarming rate; canny providers add functionality without changing users' work practices excessively. What do I mean by this? Until very recently, most word processors looked very similar. Most had similar rows of buttons, in the same place, that did the same thing. Then... enter Microsoft Office 2007! I spend approximately 50% of my work time producing documents. I began word processing about 14 years ago and have become very used to the familiar layout of the Office tools. Now I find that I need to re-learn how to use the tools I once took for granted. Since a recent upgrade to Office 2007 my productivity has almost halved due to having to re-learn adding and making diagrams, finding hidden components that were once visible, and updating diagrams that no longer work in the new format. Is it acceptable for large organisations (i.e. Microsoft) to force changes in practice in this way? We have undergone a change in work practice without consultation and without discussion of the implications of the change. We now live in a world where we fully understand that the free market can and does fail consumers (i.e. the credit crunch). As it presently stands, organisations like Microsoft have full freedom to produce software as they see fit based on (presumably) their market research findings. What this implies is that software is less likely to be focused around 'the need' and more likely around 'the profit'. I make the assumption that Microsoft felt the need to differentiate its product in a remarkable way: Office 2003 looks like OpenOffice (a free open source office package), which in turn looks like Google Docs. For businesses the central need is to maintain profitability; this isn't fundamentally wrong, but in key economic areas it is questionable whether we want 'for profit' agencies to have full control of developmental resources.
The question I raise is whether these changes in practice are acceptable, and whether greater central control would be beneficial for key computer technologies like operating systems and productivity software like word processors. Surely change would then come about because it is needed, rather than to maintain profitability for private shareholders? After all, there doesn't appear to be much new functionality in Office 2007. This concept probably sounds very left wing, but it isn't proposed for that reason. At the Lancaster Centre for e-Science, as in many software-producing academic departments around the world, our ideas are vetted centrally via a peer review process that assesses the need for what we propose. Initially we have an idea, say, to produce a new software tool based on a perceived need. We gather support from potentially interested organisations, then form a consortium that would become the project team should funding be awarded. We gain support from local organisations (e.g. regional agencies, businesses etc.), then submit the bid to a central government funding agency. It is then reviewed anonymously by experts in the field. If they see that our argument contains flaws, or that the work is not sufficiently novel (e.g. a new look but not new functionality), then they would reasonably reject our proposal. This process is not perfect, but it works reasonably well. Surely we now understand the absolute need for computers in our society sufficiently NOT to let major players force technologies upon us that really don't do anything more than earlier versions, yet change work practice significantly? For example, would the review process I highlight have allowed Microsoft to launch the Vista operating system as early as it did? Microsoft, by its own admission, launched a product that didn't do much more than XP; yet it required new machines to be purchased as it was terribly memory hungry, a clear cost to society. Maybe I am wrong!
Maybe the process we observe is that of the incumbent firm finally entering the final phase of its ultimate demise. As Rome and IBM fell from their respective pedestals, surely so will Microsoft. Maybe this is the process that will allow newer, absolutely novel approaches in computing to take over (Schumpeterian logic?). It would be crucial, therefore, not to embrace Microsoft too tightly, and to seek something a little more innovative. If Microsoft's innovation cycle is almost finished, surely those that use its technologies will be less able to innovate too.

Saturday, 4 April 2009

Synchronous communication in e-Research

A day of navel gazing, that is blog writing, helps reflection on specific issues of the day. My earlier blogs have explored the necessity of bringing together e-learning and e-research under one roof. The reason for this is quite clear: the processes of learning and research are not that different, as both are about discovery, hence the software tools required to conduct these activities are likely to be similar, to a point at least. This week saw members of the Lancaster Centre for e-Science revisiting a tool set that we created some years back but, due to funding constraints, placed on the back burner. Agora brings synchronous, or real-time, communication into the Sakai collaboration and learning environment. As with most virtual research environments (uPortal, Sakai) or learning environments (Sakai, Moodle), communication occurs in text format via forums, chat rooms or announcements. Agora brings voice, video and real-time data sharing into the equation by providing researchers and learners with access to cloud-based web conferencing. This technology is about 12 months away from deployment within the Sakai portal and should unleash a new level of learning and research ability. All users need to access this technology is a web-connected computer, a microphone and a webcam, and the webcam is optional for pure VoIP communication. As with all of our technology, no software needs to be installed on the computer itself. The question we are now focusing on is what functionality, based around the Agora tool set, should now be developed. Lancaster e-Science has a sizable user base, and this resource is used to guide technology development. To understand user needs we regularly poll our user base on what the technology should do from their perspective. For instance, we developed a new Sakai forum tool, and its creator (Adrian Fish) regularly meets with users to look at extensions to its functionality; this way of working has been very successful.
Our position is that too much 'navel gazing' fails to meet the needs of users. Developers, and this is not a criticism as it's the way keen-minded individuals work, will always produce what they see as a need, yet the final user, with much lower ICT knowledge, may need a complete redesign of the final tool if they are to use it successfully. To overcome this issue, my role is to formally evaluate the interface of the technology in various situations (e.g. the business/university interface) and how it should be developed to enhance knowledge exchange activities, based on user evaluations. We are now starting a project to look at the Agora web conferencing facility in this way. Yesterday, I met with people at the Institute of Entrepreneurship and Enterprise Development (IEED). This organisation conducts research on enterprise development and provides courses to enterprise and business professionals to enhance business activity (Google 'LEAD Programme Lancaster'). Impacting around 1300 enterprises, it faces a very typical problem: how to engage with enterprise and maintain two-way communication (i.e. the interface)? The IEED uses the Sakai portal to leverage communication. I asked how valuable the web conferencing tool would be in this environment. The response was fairly clear: in a world where green issues are becoming 'the issue', web conferencing is a natural next step, particularly so if integrated into an existing framework (i.e. Sakai). Now that the Agora web tool exists, it can be regenerated for different uses. For example, would it be useful to have a video blog? A tool where you can record a two-minute snippet of footage via your webcam, which is then automatically stored online and published as you need. It isn't a major step for the Agora tool to be redeveloped in this way. And what of additional functionality? Hmmm.... we have plans, but you'll need to keep an eye on us for the results of the research.

Friday, 30 January 2009

The dangers of Web 2.0 for business?

It is hardly surprising that many businesses struggle to make effective use of the internet. The article on the beeb about the World Economic Forum below highlights some key aspects of businesses' inability to do so.

I'm not certain that Web 2.0 technologies are really the right platform for, say, strategic business planning or idea generation, and I hope this isn't what the Davos people describe as 'businesses struggling with the web'. It is far too easy to have critical secrets whizzing about strangers' screens in an uncontrolled way. I understand from Wikinomics that this is "a" new way forward (i.e. social networking for business activity), but it seems to me to be far too chaotic to be healthy as a basis for long-term and secure business activity. Sure, having an online place for customers to discuss products is a good thing (BT are quite advanced in this respect). If managed correctly it becomes a source of nearly free market research. BUT in the more chaotic WWW, what if an organisation becomes a target of subversive activity? Much as a denial of service attack limits the connectivity of customers to a firm's website, the targeted provision of misinformation could be just as damaging. I have no evidence of this occurring presently, but I can envision it happening.

Friday, 23 January 2009

The future IS super computing and the cloud

Web-based technologies, whether called the Cloud, portals or virtual research environments, should provide us with fantastic opportunities for growth and development. My earlier post highlighted something of what lies ahead of us, but what are the opportunities? My own recent interactions with even the simplest technologies have deepened my understanding of how people can benefit from web-based communication. As a statistical researcher I see a future where standard statistical analysis programs (e.g. SPSS, Stata) are embedded into portals as standard. This would be a big step forward for a number of reasons. Firstly, my present pet hate with SPSS (for example) is that I have to reinstall the package at least once a year due to upgrades or new licence issues. In the portal world, this ceases to be necessary. Provided the university has paid the annual fee to SPSS, and provided that I am a registered university employee or student, I would be able to access its functionality from any web browser. All upgrades and licence updates would be handled centrally by university computing services. A much greater benefit to users, and to how they innovate, would be the ability of the stats packages to embrace the Cloud to handle Grid-enabled multiprocessor computation.

Datasets (i.e. the information that we collect on any matter we choose) have become larger, often running into terascale dimensions. Our ability to conduct useful estimations on data of this size is greatly diminished. It is no longer uncommon to hear of very large corporations having difficulty processing data for this reason, limiting their ability to take advantage of the latest estimation processes and slowing innovation. A scientific example of data generation on this scale is provided by the CERN laboratory, which will generate terabytes of data per experiment. On a single computer it is practically impossible to run models or tests that would 'sift' this data to enhance knowledge. The Cloud provides us with a 'super computing' platform to reduce this issue, as all data storage and computation take place away from the user's machine, harnessing many computers simultaneously and dramatically reducing computational time for many user groups. This is multiprocessor computation. The issue, however, is that many organisations may not allow their data to be stored and processed outside of their own IT networks due to data security risks, something we hear rumblings of already. For academic researchers this is likely to be much less of an issue.
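
The principle behind this kind of computation, splitting a large dataset into chunks and processing them in parallel, then combining the partial results, can be sketched with Python's standard multiprocessing module. A real Grid or cloud deployment would distribute the chunks across many machines rather than local worker processes, but the pattern is the same. The function names here are illustrative.

```python
# Minimal sketch of data-parallel computation: split the data into
# chunks, process chunks on multiple worker processes at once, then
# combine the partial results into a single answer.
from multiprocessing import Pool

def summarise(chunk):
    """Per-chunk statistic: (count, sum), chosen so results combine."""
    return len(chunk), sum(chunk)

def parallel_mean(data, n_workers=4):
    chunk_size = max(1, len(data) // n_workers)
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(n_workers) as pool:
        partials = pool.map(summarise, chunks)  # runs chunks concurrently
    total_n = sum(n for n, _ in partials)
    total_sum = sum(s for _, s in partials)
    return total_sum / total_n

if __name__ == "__main__":
    data = list(range(1_000_000))
    print(parallel_mean(data))  # prints 499999.5
```

The key design point is that each worker returns a combinable partial result (count and sum) rather than a final answer, which is what lets the computation scale out across processors or, in principle, across machines.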

Wednesday, 21 January 2009

Future proofing university e-infrastructure?

The vision of the national e-Science programme and JISC is to develop and provide future-proof IT platforms for universities. The question this discussion raises is which direction universities should take now that they face technological choices. The new way of thinking (from a JISC perspective) is that universities need an open standards platform, or a range of 'embeddable' open standard platforms, that will support many university operations (e-learning, e-research, admin) and can be tailored to each institution or applied using a generic tool stack depending on need. An example of this technology is the Sakai platform (there are others, e.g. Pluto). Although it is an observably advanced e-learning platform, it can also be used to embed research tools developed by researchers, as it offers open standards they can use to 'hang' their tools in (e.g. terascale computing tools, embedded model estimators, new forms of database handling/management, alumni management, student enrolment management, accounting). This provides a huge amount of flexibility. What this means is that a researcher can view online experiments and teaching resources under a single sign-on from any internet-enabled computer. The issue is that whatever platform the university chooses now will determine how flexible we can be in, say, 5 or 10 years. For example, choosing Microsoft solutions will imply that researchers are tied into the Microsoft tool stack in the future, and it would be very difficult to tie the bespoke tools that are now becoming commonplace in academia into the Microsoft closed source framework.

Much has been said about the potential of Moodle as a VLE, but there is no discussion of the development of research tools within this framework; yet teaching AND research need support simultaneously. A great reservation that I have regarding Moodle (personally speaking, and I would like to hear your views) is that very little investment is coming in at its base, implying a reduced rate of innovation relative to other platforms. If the university were to choose Moodle for e-learning, it would mean that independent e-learning and e-research solutions would be needed in the future, which is far less efficient (e.g. two sets of programming skills, two servers, two databases, two sign-on points, difficulty in transferring common data between the platforms, etc.). Ideally, the university should have a Director of e-Learning AND e-Research (one person, not two) so that issues of functionality and simplification across the two fields can be addressed simultaneously in an unbiased way.

Key to this discussion is that universities should retain a core skill set simultaneously adept at both e-learning and e-research deployment and tool design, able to advise researchers on these matters. To hand this to an external supplier misses the point of what Web 2.0 is all about (i.e. the flexibility to design content and tools to meet the need), as suppliers will find it very difficult to keep abreast of high performance research needs and cannot be expected to have in-house, bespoke, research-focused software development skills. Once 'digitally native' researchers come online in a few years, this is what they will want from IT, and we need to be in a position to hand it to them when they arrive. Therefore, we need to make the right platform choices now to provide what they will need in their future. To make the wrong platform choice risks damaging the potential of our future researchers and therefore research outputs.

I look forward to your comments.

Tuesday, 6 January 2009

Thoughts on Portal Technology

During presentations I have made over recent months I straw-polled audiences on whether they had heard of 'cloud' technology. Very few had, which isn't really surprising. The cloud is the latest thing in internet development. Last year Google provided us with a first insight into cloud technologies when it launched Google Apps (word processor, spreadsheet and presentation package). In essence, cloud providers take applications that you use on your computer (like Office applications, stats packages, diaries) and provide them online. For the consumer, this means a simplification of the way we use computers is firmly on the horizon. By placing applications away from your computer, you'll never need to install software again, or worry about updates; all of this will be handled by the service provider. It also means that computer technology can be simplified, as you'll only need a web browser to run these tools. This market is now mobilising quickly. Just yesterday Microsoft announced that it will launch its Office package online during 2009, and that there will be a free version we'll get access to at the price of a few installed ads. This is a major step forward (Microsoft free?) that will set the rules of next generation technology. Cloud developments highlight that the information age is only just beginning to dawn. The old internet (call it Web 1.0) provided us with access to information, a great innovation back in 1995. The new internet (Web 2.0) allows users to communicate among themselves in lots of different ways, and to rate other people's web content easily. Now that cloud generation technologies are coming, we need to focus on the new risks and opportunities that will arise. For instance, the Cloud brings increased security risk to users, as their valuable data will be stored away from their computers. Although I would personally trust Google and Microsoft to provide great security, it'll take others a little longer to become comfortable with this new way of thinking.
Full diffusion will take time, but it will occur. In any case, for organisations requiring more secure solutions, more local cloud providers (cloudlets or patchy fog, what shall we call them?) will exist to meet their needs. And what of the opportunities? Life is going to get much easier for users (hurrah), and for small businesses web technologies will be provided that meet their needs flexibly and cheaply. They will have access to technologies that only large firms can afford to install presently. Of course, there is much more to the risks and opportunities of this technology than meets the eye. Look out for my next post as I unwrap the discussion.