Category Archives: Stories

Prospects in Data Science – A multidisciplinary symposium

For three days in January, some of the key thinkers in the field of data science met in the elegant and inspiring surroundings of the New Forest, just outside Southampton, to reflect on the current state of play in the discipline.


The event took place from 12th to 14th January 2016 at the Careys Manor conference centre and hotel in Brockenhurst, in the heart of the New Forest.

This high-level event on data science, organised by the University of Southampton, aimed to examine the prospects for data science, understood as the interface between statistics, mathematics and computer science that provides new methods for handling, analysing and extracting knowledge from data, including Big Data. There was a good turnout throughout the event.


Anne Trefethen, Chief Information Officer, Pro-Vice-Chancellor and Professor of Scientific Computing at the University of Oxford, presenting her perspective on data science

Outline programme:

Day 1 (Tuesday 12) 

  • David Hand, Imperial College London – The roles of models in data science
  • Peter Grindrod, University of Oxford – Red Herrings and Wild Goose Chases: Creating Analytics for Impact
  • Wendy Hall, University of Southampton – Observatories and data analytics for Web Science
  • Frank Wood, University of Oxford – Probabilistic Programming
  • John Aston, University of Cambridge
  • Fai Cheng, Lloyd’s Register – Big Data and its Transformational Effects

Day 2 (Wednesday 13)

  • Gunnar Carlsson, Stanford University
  • Vitaly Kurlin, Durham University – Topological Computer Vision
  • Jane Naylor, Office for National Statistics – Data science and its role within official statistics
  • Anne Trefethen, University of Oxford
  • Sofia Olhede
  • Posters and networking

Day 3 (Thursday 14)

  • Arthur Gretton, University College London – Kernel nonparametric tests of homogeneity, independence and multi-variable interaction
  • Dave Coplin, Microsoft UK – The Rise of the Humans: How to Outsmart the Digital Deluge
  • Yike Guo, Imperial College

David Hand, Imperial College London, gave an excellent opening talk which set the scene for the rest of the conference. Data science and Big Data have been around for a while and are well understood by mathematicians, but only now are they receiving greater interest from the public, the media and politicians.

Peter Grindrod, University of Oxford, presented a provocative reflection on the challenges and opportunities involved in setting up a data science programme: in other words, an examination of the big, high-level issues that must be tackled in order to achieve a successful educational outcome. One key aspect is the recognition that strong leadership skills are needed at all levels, including at the funding bodies themselves.

Overall, there were, to my mind at least, a number of key messages that emerged from the talks and debates that took place over these three days. There is something of a fight, or at least a competition, to own data science. This is problematic, as data science is a skill set or approach that spans traditional disciplines which still need to thrive. No agreed understanding has yet emerged of what data science is or what skills its practitioners need. It is also clear that data science is a fast-moving and evolving field. Finally, there was a call to action for senior policy-makers to grasp this topic and to understand the need to provide some subtle steer on the trajectory and velocity of change.

Internet of Things and Food: ITaaU/FSA programme outcomes

The Internet of Things offers great potential benefits for health and well-being in many areas, not least in how we manage the production, transportation and storage of the food that we eat. Understanding how we can and should benefit from the mix of technology and process requires that a number of key sectors share thoughts, knowledge and vision. We see this discussion as needing input from policy-makers, agencies, academics and industry. We welcome participants from such areas to learn about the work of our recently funded pilot projects and to contribute to the debate on future direction of this exciting field.

This event brings together researchers from the recent programme of pilot projects run by the Food Standards Agency and the RCUK-funded IT as a Utility Network+. Key outcomes, including the benefits and potential of IoT for improving security across the food chain, will be presented. The role of data, both open and closed, will figure strongly in these conclusions. The event is targeted at researchers and policy-makers in the fields of IoT, food, food security, food transportation and storage, and wider environmental issues.

The event has been structured so that detailed discussions take place on Monday, with an overview and more strategic discussion on Tuesday. Policy-makers may therefore wish to attend only day 2, while academics and other experts may attend both days, or just day 1. (Do let us know if you are only interested in partial attendance.)

Event details for the 7-8 March, London – further information

Food Standards Agency (FSA)/ITaaU IoT projects workshop, 18 January

This event is designed to welcome funded project members to the University of Southampton and introduce them to the ITaaU and FSA teams.

In addition, we will be preparing for the larger event that we are planning in London on the 7-8 March. This meeting, which will be held in Westminster, will act as a showcase for the programme and present the outcomes of the project to a wider audience of specialists, industry figures and policy-makers.

The programme for the day is as follows:

10:30 – 11:00 – registration and coffee

11:00 – 11:30 – welcome and intro to ITaaU and the FSA

11:30 – 12:30 – intro to the 4 projects

12:30 – 13:15 – sandwich lunch in the meeting room

13:15 – 14:30 – project plans, needs and opportunities

14:30 – 15:15 – planning for the London meeting – 7-8 March

15:15 – 15:30 – wrap-up

EPSRC-funded CommNet II announced

CommNet II: UK Research Strategy Community Organisation in Communications, Mobile Computing and Networking within the EPSRC ICT Portfolio

On 25th November Professor Timothy O’Farrell said: we are pleased to announce the launch of the new EPSRC-sponsored “UK Research Strategy Community Organisation in Communications, Mobile Computing and Networking within the EPSRC ICT Portfolio”, referred to as CommNet2.

Our new website is live at http://www.commnet.ac.uk/ and you are invited to register now for information on events, news and networking activities. Please explore the site features, in particular our use of “Groups” to support community initiatives. We are planning a kick-off event in Q1 2016 shortly followed by our first challenge workshop where topics on “5G and IoT” will be discussed.

Building on its predecessor CommNet, CommNet2 aims to bring together the UK academic community engaged in ICT research in order to identify, discuss and address the major ICT challenges of the future. CommNet2’s Management Board includes Professor Timothy O’Farrell (University of Sheffield), Professor David Hutchison (University of Lancaster), Professor Gerard Parr (University of Ulster), Professor Dimitra Simeonidou (University of Bristol) and Professor Rahim Tafazolli (University of Surrey). Further UK academics and industry partners are members of the Advisory Board; full details are available on our website.

The challenges addressed by CommNet2 necessitate a cross-disciplinary approach to research. Hence, the network will bring together academics from the electrical engineering, electronics, communications, networking, mathematics and computer science disciplines. However, the network is accessible to anyone interested in contributing to research and innovation in ICT, including both early career researchers and ICT professionals as well as established academics. Industry is also welcome to participate in our network activities.

Building on the EPSRC ICT Theme Priorities, CommNet2 will deliver a three year programme that includes networking and training events; international research-horizon scanning; and best practice challenge workshops and conferences. Specifically, the network will create a forum to enable coordinated discussions leading to the formulation of future research bids and world leading publications.

Please share this information with your colleagues who may also be interested in participating in the network and encourage them to register. Users can select the amount of information they wish to receive.

We look forward to welcoming you to our kick-off event. Please contact info@commnet.ac.uk for further information, to ask questions or make any recommendations on the activities and conduct of the network.

Timothy O’Farrell

Professor of Wireless Communications

Director CommNet2

University of Sheffield

The changing face of official record keeping in UK, Ireland and the Netherlands

Further observations from the Threats to Openness in the Digital World conference, Newcastle 24-25 November, 2015

Extreme weather metaphors were flying around at Northumbria University in Newcastle this week where the Threats to Openness conference was looking at the future of the public record in the digital age. According to the speakers, we’re facing a digital deluge in the coming decade as a result of a perfect storm of technological, political, legal and economic changes.

The fate of government records was at the heart of the discussions, due to two major developments that are hitting archivists simultaneously. First, over a ten-year transition period that began in 2013 and involves releasing two years of records each year, records now have to be transferred to The National Archives after 20 years rather than the previous 30. Second, those sets of records will soon also be the first to be predominantly digital. So, in 2016/17, digital records from the early 1990s – not least those dealing with decisions about wars in Iraq and Afghanistan – are due to be processed.


This presents challenges at all levels of the transfer process, as David Willcox, digital sensitivity review lead at the National Archives, outlined. Digital records bring increases in both volume and complexity.

“We do not have dockets and files any more. We have blizzards of emails stored on different computers – a morass of stuff. The best thing you can say about it is that it’s data,” explained Tim Gollins, of the National Records of Scotland.

Two thirds of government data is held on shared drives, confirmed David Willcox. Email accounts for 50-70% of content, with one government department revealing that it had an impressive 190TB of email data.

This makes appraisal – the appreciation of the value and historical relevance of a record – more difficult. Even more challenging is sensitivity review – the process by which records are checked for compliance with data protection laws and any risks to national security, damage to international or business relations, personal information and so on. This review determines whether a record is retained by the department, sent to the National Archives as a “closed” record or opened to the public. According to Arthur Lucas, member of the Lord Chancellor’s Advisory Council, 75% of the documents going into the National Archives are closed, although he pointed out that transferring a document closed is not a way to “bury” a record and it is preferable to it being retained as at least it is indexed and its status can change in the future. Sensitivity review is currently done by humans, page by page, and can be a lengthy process. And it presents a real conundrum in the digital age: the nature of sensitivity review is inherently tricky for computers but sheer volume means that bringing in technology may be the only option.

Technology can certainly assist: eDiscovery tools can be used to apply categorisation or clustering to unstructured information; software can help highlight themes, events and people (which may help with reducing duplication – there is around 40% duplication in government records); 75% of exemptions from leading government departments relate to personal information and this is a good starting point for technological solutions as it should be easier for software to highlight easily identifiable fields such as names and addresses.
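As a toy illustration of that last point (the patterns, labels and sample record below are invented for this sketch, not taken from any real review tool), highlighting easily identifiable personal-information fields can start with simple pattern matching:

```python
import re

# Illustrative patterns for "easily identifiable fields" -- a real
# sensitivity-review tool would use far richer models than regexes.
PATTERNS = {
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b"),
    "phone": re.compile(r"(?:\+44\s?|0)\d{4}\s?\d{6}"),
}

def flag_personal_info(text):
    """Return (label, match) pairs for candidate personal-data fields."""
    hits = []
    for label, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((label, match.group()))
    return hits

record = "Contact J. Smith at j.smith@example.gov.uk, SW1A 2AA."
print(flag_personal_info(record))
```

In practice such rules would only be a first pass, combined with the clustering and theme-extraction approaches mentioned above and, crucially, human review.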

Research into this is crucial, and it is underway. Michael Moss, professor of mathematics and information science at Northumbria, and Tim Gollins have been working on an IT as a Utility Network-funded project to look at methods and algorithms that will enable the creation of useful tools. The CIA is working with the University of Texas on tools for CIA records. There is interesting work ongoing at Columbia University (they are “approaching record keeping from a completely different perspective,” commented Moss. “They have reconceptualised the archive from ‘a whole collection of documents’ to ‘data you can analyse’.”)

Why does this matter? “Transparency and openness,” said Willcox. “Good governance,” said Sir Alex Allan, former permanent secretary at the Ministry of Justice and the Cabinet Office and author of the Records Review report into the readiness of government to move from the 30-year to the 20-year rule. “Good record management is not just about preserving the historical record. It’s also important for the efficient running of the office.” When civil servants are asked for advice on a particular policy issue and they know that they looked at that same issue a few years ago, they need to be able to find the papers that relate to the discussions and decisions. It is also important for audit and accountability – how do you know whether a private sector company contracted to do public sector work is fulfilling its contract if you can’t find the paperwork? – and for the provision of evidence to public inquiries and legal proceedings.

A cautionary tale of what happens when it goes wrong was provided by Mary Daly, president of the Royal Irish Academy. She related the woeful story of how a key government decision was made during the Irish banking crisis in 2008. Or, rather, how we don’t know how that key decision was made because the records relating to it are “inadequate. In fact, non-existent. There was a complete lack of proper procedures.”

But the public record – and the huge changes it is undergoing in the digital age – is not restricted to government. According to Jeremy Frey, professor of physical chemistry at the University of Southampton, the scientific record, which is – or should be – also a public record when it is publicly funded, is also evolving.

“We are in a liminal period. Publication is a ritual we all go through, but what we’re really about now is moving from a paper publication paradigm to a digital one that will allow much more.”

However, there is currently a gap between the opportunities offered by digital and, in many cases, how they are being exploited – or not.

“Scientific papers have become a repository for the argument but most of the data is missing. In the past absolutely everything was in the paper, because it could be. Scale was not a problem. That has changed,” argued Frey. If researchers do not see and value the data, then they cannot be sure that they can trust the value chain, and lack of trust in the data can destroy the scientific endeavour. As well as more intelligently accessible data (a scanned copy of data in the form of an image cannot be usefully searched), there needs to be more detail on methods and, to be disseminated effectively, the data needs a narrative: “the story that you weave around the data is as important as the data itself.”

For the humanities and social sciences, there is a challenge rearing from a different direction. David Erdos, lecturer in law and open society at the University of Cambridge, gave a rich rundown of the current state of the EU data protection landscape, how it is set to change and why it should concern humanities and social science scholars.

Currently, derogations within the directive allow some wriggle room for particular, special purposes, notably journalism, literature and the arts. The EU is now proposing that these are contained in a “middle area” covering knowledge facilitation more generally. One area of concern for researchers is that there would be no derogation from the proactive duty to provide privacy notices if the purpose of the data use changes. While biomedical research organisations have been busy lobbying about this, Erdos said that such activism needs to extend to the social sciences and humanities research community. “I have been trying to make them aware of it and their obligations around data protection,” he said. “The whole landscape is very confused around this and research ethics and policies. Seemingly, there is very little understanding of the implications of legislation. The community needs to fight for this – that’s what the press and journalists do.” It is likely that research will end up being an area for which a huge amount of discretion is passed to member states and, so, working on a national level will be as important as the European level.

Agnes Jonker, senior lecturer in Archives at the Archiefschool (the Netherlands Institute for Archival Education and Research), University of Amsterdam, gave an insight into how the Dutch treat access to the public record, and how this compares to the UK. Most notably, the Netherlands’ first FOI law was drafted in 1980 (the UK’s FOI Act only came into force in 2005), so there is, generally, a more relaxed air around the concept. That is not to say it is without its critics: a new FOI law was proposed in 2013 (though it will not be in force for a few years) to update legislation in the light of changes to the state, since, with the shrinking of government through increased privatisation, third parties are escaping FOI scrutiny. However, the proposed new law makes no reference to the duty to document.

Coming full circle back to public records and the humanities: Andrew Hoskins is a military historian who is concerned that current developments will render the record of warfare uncertain. He is particularly worried that the reduction from the 30-year rule to 20 years will result in more records being closed. “The buffer protects those potentially subject to embarrassment or danger. But buffering time is under pressure. From 2013 to 2023, two years of records will be processed every year without a doubling of resources. It’s punching holes in the records in demand by historians. It’s not a recipe for careful selection and preservation,” he said.

Hoskins, like Frey, thinks that the “story” is crucial – and risks being lost with the move to digital files. “It shifts away the context that comes from the handling of the physical file. Without the material context, the front and behind of each file, the information might be found but the story might be lost,” he said. “A history of warfare that depends on the official records of the British army has an uncertain future. Over the 2000s we’ve seen a perfect storm of technological, economic and political change at all points from collection to collation and archiving and assessment for declassification through to their being made public by the archives – and some of these pressures result from the culture of openness that has attached itself to current technological changes without adequate resources and understanding of the issues. It’s not a recipe for improved public access. Faster history is not necessarily better history,” he warned.

What’s the answer? The conference concluded by considering potential ways of moving forward and the actions, partnerships and collaborations required for that to happen. In the words of David Thomas of Northumbria University, the ideal is to “take the archival idea and reinvent it in a new context of record making and record keeping in a new social world.” Watch this space.

Written by Michelle Pauli

 

 

Threats to Openness in the Digital World

Threats to Openness in the Digital World – 24th – 25th November, 2015

The complex and sometimes contradictory challenges presented by the role of official records in the digital age were eloquently presented and rigorously discussed at a two-day conference in Newcastle this week. Within the imposing and inspiring ambience of the Great Hall in the Sutherland Building at Northumbria University, a large audience of interested parties relished the opportunity that this event presented. ITaaU was pleased to act as co-sponsor along with the Joseph Rowntree Charitable Trust. The event continued a line of collaboration between ITaaU and the organisers, in the form of work on sensitivity review as records, soon to be increasingly digital, become due for release into the public domain. A more substantial report on the event will follow shortly.


Professor Mary Daly, President of the Royal Irish Academy, gave an illuminating and enthralling talk on the role of official records and digital archives in recording decision-making during times of great turmoil, specifically in relation to some recent Irish commissions and inquiries, including the various banking inquiries.


The Great Hall provided a perfect setting for the two-day conference.


Professor Julie McLeod, Professor in Records Management in the iSchool (Department of Mathematics and Information Sciences), Northumbria University, opened and closed the conference and ensured that the talks and panel discussions were timely and lively.

Further information can be found on the @Threat2Openness blog site: https://threats2openness.wordpress.com

Interlaced 2015 by Rhyannan Hall

Interlaced 2015 offered a fast-paced and multi-faceted insight into an interdisciplinary arena of creativity. The emergent nature of this fusion of art and engineering was evident in the lack of a shared vocabulary among the delegates; indeed, several highlighted this as an important area for development if we are to move forward with wearable technology.

Broadly, wearable technology seems to have developed into two distinct but related strands. The first is wearable technology for the purposes of fashion, which was the focus of the conference. The second is the use of the technology in medical or work-related contexts.

The most visually spectacular examples come in the form of light-up gowns and jackets. CuteCircuit have patented their own methods of creating garments with so many lights embedded in the fabric that the garment essentially becomes a screen. Moritz Waldemeyer has also created superb work: visually stunning and edgy.

Conor Farringdon gave an excellent talk in which he raised the moral and social issues arising from wearable technology. Garments that can collect our personal data leave us vulnerable to infringements of our privacy, a key area of concern for Helen Oliver, who is using her PhD to research ways we can protect our privacy.

The grand finale was a catwalk show of wearable technology, featuring 3D printed fabric, LED dresses, and the work of several recent graduates.

Although the event was exciting, I do have some concerns about moving forward. Many delegates spoke of the fragility of the technology and how it is proving incompatible with textiles, and of the problems of collaborating when there is no shared vocabulary. With no specialist courses currently running in the UK, only a handful of people understand both sides of this eclectic area of development. The more successful designs had been created and engineered by one person who understood both aspects of the problem and could produce a holistic object.

Rhyannan Hall

Rhyannan is a costumier and has a passion for dye work, pattern cutting and wearable technology. Her work can be found at www.rhyannanhall.co.uk.

Community Conference 2015, Southampton

On 6th and 7th July we celebrated another successful year with a two-day programme of talks, demonstrations and conversations. In addition to lightning talks and demonstrations from many of our funded projects and other related initiatives, we were very pleased to include the following speakers:

  • Jeremy Frey, University of Southampton: From living labs to digital utilities and services
  • Philip Godsiff, Surrey Business School: The role of crypto-currencies: money as a utility?
  • Tracy Keys, RCUK: A research council’s perspective
  • Nick Long, Southampton Solent University: the art of branding and identity in the digital age
  • Anisah Osman Britton, The Bakery Accelerator Programme: Women in innovation
  • Zoe Philpott, Interactive Storyteller: Ada Lovelace, retelling the story
  • David Rew, University Hospital Southampton: The clinical informatics revolution
  • Alan Scrase, University of Southampton: SETsquared’s high-tech startup support partnership
  • Elena Simperl, University of Southampton: Open data and social machines
  • Amanda Smith, Open Data Institute: Open data – creating partnerships, changing business, connecting cultures
  • Andy Stanford-Clark, IBM: What next for the Internet of Things?
  • Paul Watson, Newcastle University: Healthcare in the Cloud
  • Alana Wood, ustwo: Holacracy, time and design studio culture

The conference took place at the Solent Conference Centre in Southampton city centre.

Further reports, photos, slides sets and films will appear shortly.

Are we really multidisciplinary? TRIFoRM bridges some UCD gaps

As the TRIFoRM project draws to a close with results about the levels of trust which motivated users are willing to place in the technology they use, we are particularly mindful of one of the original goals of the ITaaU: to encourage and facilitate multidisciplinary interaction and collaboration. In a nutshell, we need to learn to work with each other, understanding different perspectives and views, and exploiting the outcomes of truly cross-disciplinary insights. In that vein, TRIFoRM has uncovered another layer of complexity. It is not just about engineers and researchers from ICT and the social sciences working together to understand the problem and work towards its successful resolution; it is also about what the potential beneficiaries understand and how they react.


“Estirar de la soga” by Macobru – Own work. Licensed under CC BY-SA 4.0 via Wikimedia Commons – https://commons.wikimedia.org/wiki/File:Estirar_de_la_soga.jpg#/media/File:Estirar_de_la_soga.jpg

Looking at trust and trustworthiness in ICT systems, it is to be expected that HCI experts, technologists (in our case, especially from the IoT and sensor networks) and social scientists would all need to collaborate and contribute to understanding the parameters around technology and service adoption. This cross-disciplinary engagement was needed, of course, to identify just what we need to look for in terms of a potential user’s propensity to trust and their behaviours towards technology, and this is one of the huge benefits of the collaborative networks building up around the ITaaU forum. Far less obvious, though, were the collaborative dimensions around the technical service we were investigating.

In TRIFoRM, we engaged with both clinicians and patients involved in the management of chronic pain. The technology component aims to provide monitoring and self-reporting capabilities on a smartphone or similar device, which would remotely upload data to a central server, where those data would be aggregated so that the clinical team, from the consultant to the specialist nurse, would have access to relevant long-term objective data to support and supplement direct patient contact. We should remember at this point that any such service would involve sensitive personal data. That in itself set off alarm bells. For the technologists: would the clients and servers, as well as the connections between them, be robust enough to protect and maintain data integrity and prevent intrusion? For the social scientists: how could potential users be assured that the technology would appropriately manage such data? And for the end users themselves?
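Purely as a sketch of the aggregation step described above (the patient identifiers, week labels and 0–10 scoring scale are invented assumptions, not details of the TRIFoRM service), the server-side summarisation might look like:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical self-reports uploaded from patients' phones:
# (patient_id, ISO week, pain score on an assumed 0-10 scale)
reports = [
    ("p01", "2015-W40", 7), ("p01", "2015-W40", 6),
    ("p01", "2015-W41", 4), ("p02", "2015-W40", 3),
]

def weekly_summary(reports):
    """Aggregate raw self-reports into per-patient weekly averages,
    the kind of long-term view a clinical team might review."""
    buckets = defaultdict(list)
    for patient, week, score in reports:
        buckets[(patient, week)].append(score)
    return {key: round(mean(scores), 1) for key, scores in buckets.items()}

print(weekly_summary(reports))
```

In a real deployment, of course, the sensitive-data concerns raised above (encrypted transport and storage, access control for the clinical team) would dominate the design rather than the aggregation itself.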

For chronic pain sufferers, there were no such issues. Anything which might help them as well as the medical team manage and relieve their condition would be welcome; data security was not an issue. But are they simply naïve to the risks associated with technology? After all, as our clinician pointed out: systems are only as secure as their users. The point is rather more to do with personal relationships. Overwhelmingly, users would trust the technology in the same way that they might trust the medical team. There is however an important caveat: trust transfer of this kind whereby benevolence is perceived in the technology as a result of confidence in the medical team would occur and persist solely if it added to and did not undermine the person-to-person relationships that the patients valued with the clinical team. In an interesting perspective on ease of use and utility, the other common trustworthiness feature associated with benevolence, our participants highlighted that technology might alleviate or compensate for the physical as well as cognitive limitations which their condition imposed.

At the recent ITaaU event in Southampton, the surgeon David Rew made the point that what he and his colleagues wanted was intuitive presentation of the complex patient data they needed to digest to support their patients; turning away to consult a computer screen was not an option. That is one very important factor. Patients themselves however want to support and enhance that relationship through technology too. What TRIFoRM has shown us and our colleagues is the importance of going beyond user centred design to look at the personal and social context of technology deployment and acceptance.

Taking TRIFoRM out to the community

Last week marked a high point in the TRIFoRM project, as we were able to travel to the ITaaU Community Conference to engage with a wealth of other fascinating projects while showing off our work.

In terms of the “showing off” part, we first gave a lightning talk about what TRIFoRM is doing and why, how it relates to other projects such as OPTET, and – last but not least! – what we have achieved. We followed this up with a poster and demo; three of the project investigators were able to talk with conference attendees in much more detail about TRIFoRM’s aims, objectives and achievements. We were, in fact, so busy talking to people during this session that – alas! – we did not manage to get a photograph of our poster or ourselves in action. Don’t take our word for it that the poster looked good, though:

TRIFoRM

As you can see, we had plenty to talk about. Since our previous blog post, we’ve been busy: led by the social scientists in our team, we have assembled a state-of-the-art review of trust (considering trust in humans as well as in technologies), before using that theoretical grounding to guide semi-structured interviews with stakeholders. Which stakeholders, you ask? We chose to focus on healthcare monitoring technologies for people suffering from chronic pain, and spoke to people with such conditions as well as someone who provides services for them.

We got some great insights. The two blue diagrams in the bottom left of the poster are thematic maps showing knowledge we gained about technology acceptance and trust transfer, while on the right are diagrams representing two types of threat to trust that we identified from those interviews.

If you want to know more, don’t worry: our third and final blog post will give more detail about our key findings.