
As we reach the last remaining weeks of 2016, it’s time to sit back and reflect on the year we have had. 2016 has been quite the year, with Brexit, Trump becoming President, Tim Peake going into space and back, David Bowie dying, Leonardo DiCaprio winning an Oscar, Bob Dylan being awarded the Nobel Prize in Literature and Team GB coming second in the medal table at this year’s Olympic Games. It has been a very unpredictable year!

Things have been busy here at Connexica, so here’s a rundown of what we’ve been up to.

This year we celebrated our 10th anniversary. Connexica has grown dramatically over the past decade, and 2017 is set to be another fantastic year as we continue to grow, develop new solutions and reach out to new customers.

2016 saw our return to EHI Live at the NEC to demonstrate our data solutions to members of healthcare and public sector organisations. The two-day exhibition and conference provided a great opportunity to talk all things data and to show attendees how our solutions are used within healthcare and other sectors.

This year our developers have been working hard and have forged some fantastic new features.

TechMarketView awarded Connexica the honour of Little British Battler earlier this year. Each year the award highlights 12 innovative UK organisations and recognises their contributions to the technology world.

Gartner recognised Connexica and our solution CXAIR by placing us on its ‘Other Modern Vendors to Consider’ list. The list positioned us alongside five other search-based data discovery providers, all of which are American – it is a great achievement to be recognised on this list and to represent the United Kingdom.

This year we attended The Sentinel Business Awards, where we competed alongside Capula and WoolCool. The evening provided a great opportunity to meet and network with other local organisations that share a common pride in local business.

Our continued expansion has not faltered this year, with recent implementations of CXAIR within various sectors including banking, local government and healthcare, to name a few. These implementations further highlight the flexibility of CXAIR and its self-service capability, which make it accessible to anyone in any industry.

2016 has seen Connexica secure places on various frameworks, including Jisc and CDIS. These frameworks allow organisations to procure our solutions easily, speeding up the often lengthy procurement process faced by public sector organisations.

On behalf of everyone here at Connexica, we would like to wish you a Merry Christmas and a prosperous New Year.


Each year we bring you a selection of our predictions for the following year. This year, however, we thought we’d be different and look back at the technology predictions we made last year to see which were accurate and which have fallen short.

Prediction 1 – Internet of Things is becoming more practical

It’s fair to say that this prediction was something of a given with the birth and growth of major consumer technology wearables such as the FitBit, Apple Watch and other new entrants to the market in recent years.

Following on the same track, a report by Markets and Markets estimates that the overall market for wearable technology is expected to hit $31.27 billion by 2020, growing at an annual average rate of 17.8% between 2015 and 2020.

While just a segment of the IoT market, wearable technologies are a huge driver of the growth and success of the overall market. This huge growth forecast illustrates the significant increase in investment we have seen this year and will continue to see over the coming years in IoT devices.

Prediction 2 – Big data gets bigger (Information of Everything)

The previous prediction and this one are very much interlinked. The more successful and prominent the Internet of Things becomes, the more devices there are and the more numerous and varied their applications become.

All of this contributes to the continuing increase of the amount of data in the world and in particular will continue to drive high volume, high velocity and highly variable data – big data.

It’s safe to say that this prediction is just getting started this year and will continue on for the next five years.

Prediction 3 – Virtual Assistants – from Clippy to Cortana

Indeed, the development of the virtual assistant is available for all to see. We started off with Clippy, the animated paper clip character in Microsoft Office that had a largely negative response from Office users.

Unfortunately for Microsoft, even though they had the right idea, either the technology available at the time was not up to it or, quite simply, their execution was off. Nowadays, we are spoilt for choice. We already have virtual assistants on our smartphones and other similar devices in the forms of Siri, Cortana and Google Now. The functionality of all of these has been upgraded through subsequent updates this year.

Google Now has been phased out and replaced by its evolved form, Google Assistant. Unlike its predecessor, this new virtual assistant is capable of two-way conversations and has been integrated with Google’s latest technology releases: their new home assistant, Google Home, and their new standalone smartphone, the Google Pixel.

The former is intended to compete with Amazon’s Echo smart home virtual assistant, which has also just been released in the UK this year.

There’s plenty of action and developments in this market and I believe that this prediction has rung true with the promise of further developments and updates in this space in the years to come!

Prediction 4 – The Device Mesh

The device mesh is something that has been occurring naturally over the last few years. It describes a world where all of our devices are able to connect and share information with one another, aka the Internet of Things.

I believe this year, with further “smart” devices filtering onto the market, has not only created a wider “device mesh” but also improved vendors’ understanding of how to continue developing their products so that the device mesh becomes ever more seamlessly integrated.

Prediction 5 – Advanced Machine Learning

It’s fair to say that advanced machine learning hasn’t quite become mainstream in the consumer market as of yet. However, that is not to say it has turned out to be just another redundant buzzword – not at all. I think most already realise the wide range of applications that advanced machine learning technologies can provide.

In fact, big data technologies have certainly benefited from these advanced machine learning concepts with many vendors adapting their technologies to include some sort of predictive modelling component.

We ourselves have taken advantage of machine learning technologies this year in our latest product development within CXAIR – predictive modelling. This includes the capability to build neural networks which ‘predict’ the likelihood of certain outcomes based on historical data.
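
To give a flavour of what this looks like in practice, here is a minimal sketch in Python using scikit-learn – purely illustrative, not the CXAIR implementation, and the file and column names are invented:

```python
# Illustrative sketch only - not the CXAIR implementation.
# Assumes a hypothetical CSV of numeric historical records with a binary "outcome" column.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

history = pd.read_csv("historical_outcomes.csv")   # hypothetical extract
X = history.drop(columns=["outcome"])              # predictor variables
y = history["outcome"]                             # 1 = outcome occurred, 0 = it did not

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# A small feed-forward neural network wrapped in a scaling step
model = make_pipeline(
    StandardScaler(),
    MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500, random_state=42),
)
model.fit(X_train, y_train)

# predict_proba returns the estimated likelihood of the outcome for unseen records
likelihoods = model.predict_proba(X_test)[:, 1]
print(likelihoods[:5])
```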

Prediction 6 – 3D printing

While arguably already a massive trend in 2015, I felt this had to be included because of the advancements made, especially within the medical field. Not only can we now 3D print limbs but also blood vessels: Chinese scientists have successfully implanted 3D-printed blood vessels in monkeys, marking an important step towards 3D printing and successfully implanting organs in humans.

Prediction 7 – Adaptive Security Architecture

On the day of writing this article, technology giant Yahoo has been the victim of a data breach by a group of hackers. It seems clear that the bigger and better we become at capturing and storing data, the more we have to lose. And it appears that even the technology giants aren’t invulnerable to being hacked, with several high-profile attacks taking place over recent years against Apple, Google and Sony Pictures, to name a few.

I think it could be argued there is still a long way to go until the technologies of the future are “hack-proof”, if that is even possible. Cybersecurity that can identify security threats and respond accordingly will be essential as our technology infrastructure becomes larger and more sophisticated.

Prediction 8 – Ambient User Experience

User experience has definitely been a key focus across a range of industries for a number of years now, but with the advancements made in virtual reality and with it becoming a far more mainstream technology, it seems as though user experience is about to hit a new level.

Further innovations in this area include advancements made by UK based technology company Ultrahaptics who have developed a technology using ultrasound to create invisible buttons, dials and tangible interfaces that respond when touched. It is believed the technology has huge implications for virtual reality, household appliances and the automotive industry.

Prediction 9 – Commercial Drone Use

I might have been a bit lucky with this one: the Amazon Prime Air service I briefly mentioned last year completed its first customer delivery on 7th December 2016 – just last week!

Don’t get too excited though, this is just a trial of the service, open to a very limited number of customers – just two of them, in fact – who perhaps by coincidence happen to own massive gardens and live in close proximity to the depot…

While it is obviously a great breakthrough for the team at Amazon, it will still be a while yet until commercial drones really take off (see what I did there?).

 

And that wraps up our 2016 technology predictions revisited! I hope you have enjoyed the article. If you want to get up to date and start plotting for next year, then take a look at our obviously 100% accurate technology predictions for 2017!


With 2017 fast approaching, it’s time for our healthcare technology predictions.

Every day we are bombarded by the media highlighting inefficiencies in the NHS, the obesity crisis and an ageing population.

How can technology help drive efficiency, reduce the burden of our overworked NHS and improve the lives of the population?

1 – Rise in virtual GPs

The Office for National Statistics has recently reported an increase in the use of the internet to research health-related information, with 68% of 25-34 year olds now turning to the internet for advice and information. Combine this with the findings of the 2016 GP Patient Survey and the trend is clear:

People are now turning to telemedicine, using websites and apps to access medical care and information. Certified doctors are now available at the click of a button to diagnose and give advice for peace of mind, provide electronic prescriptions and in some cases, organise medicine delivery and self-test kits, results and feedback.

Accessing this medical advice does have a cost associated with it, but for many this is negligible when weighed against the speed of access to a medical professional.

While there are limitations within the telemedicine industry, such as a lack of physical consultations and a reduction in care continuity, 2017 will see a rise in use as patients are increasingly frustrated with waiting times. When patient care is managed effectively, telemedicine providers should alleviate some of the pressure on the NHS.

2 – Data security will be under threat

Unfortunately, 2016 saw Northern Lincolnshire and Goole NHS Foundation Trust experience a cyber-attack that forced the trust to cancel hundreds of planned operations and outpatient appointments.

Experian has released its 2017 data breach industry forecast and it does not look positive for the health sector. Among the report’s top five predictions: “Healthcare organisations will be the most targeted sector with new, sophisticated attacks emerging” – and it is no secret that personal medical data remains one of the most valuable types of data sought after by hackers.

Healthcare organisations need to be ready for an increase in data breaches – organisations that are unprepared will be at an increased risk without stringent data security measures, contingency plans and fully trained employees.

The report goes on to state: “Ransomware presents an easier and safer way for hackers to cash out; given the potential disruption to a company, most organizations will opt to simply pay the ransom. This has unintended consequences of funding more research and development by attackers who will in turn develop more sophisticated and targeted attacks.”

3 – Expansion of technology in medical tools and apps

2017 will see a surge in funding for R&D in the use of technology for medical tools. 2016 has already seen GlaxoSmithKline team up with Google’s parent company, Alphabet, to develop miniature electronic implants. The partnership will be the first of many, with innovations such as implants able to treat asthma, 3D-printed organs and developments in wearable devices changing the way we study our health.

For a long time now healthcare apps have focused mainly on wellness, with over 165,000 health-related apps on the market today. PricewaterhouseCoopers predicts that 1.7 billion people will have downloaded a mobile health app by 2017. New apps will not only monitor patients but also diagnose a variety of ailments and, in some cases, predict when an illness will occur.

4 – Increase in spending

One thing for certain is that 2017 is going to be the year of spending – Gartner predicts that companies worldwide will spend $3.5 trillion on IT in 2017. In a shift toward cloud technology, this spend is expected to go on software and services as opposed to more traditional hardware. Software spend is projected to be up 6% in 2016 and 7.2% in 2017 to total $357 billion. IT services spending is set to grow 3.9% in 2016 to reach $900 billion, and increase 4.8% in 2017 to reach $943 billion.

5 – One to watch out for

Blockchain technology has been around for a while now, with Bitcoin claiming many headlines this year. 2016 has seen the hype build around blockchain technology, so expect 2017 to take hype to reality as people move from proof-of-concept to production and real-world use cases, especially in the finance and government sectors. Look out for blockchain integration and private blockchain networks as the technology is more widely adopted.

We hope you have had a successful 2016 and are looking forward to the interesting technology advancements of 2017!


While 2016 may be remembered for its unexpected political outcomes, the technology news has also seen its fair share of publicity. A year that will be remembered for the forced ‘upgrade’ to Windows 10, the short-lived phenomenon that was Pokémon Go and Apple removing the headphone jack from the iPhone, opting to sell no fewer than seventeen different dongles, has left 2017 as a year that must generate more positive headlines.

With a healthy dose of optimism, here are the top five 2017 technology predictions.

Data-Driven ‘Internet of Things’

Despite initial promise, the ‘Internet of Things’ has not had quite the impact many first envisaged in 2016. While there is a tangible sense that this is a technology with a lot of promise for businesses and consumers, the release of failing Wi-Fi kettles and Amazon’s lacklustre Dash buttons has served only to dilute the benefits of a connected home tailored to individual requirements.

However, Gartner maintain that the ‘Internet of Things’ will continue to grow, with a predicted $1 trillion of savings for consumers and businesses while increasing data storage by as little as 3%. While the data will continue to grow, it will expire at an earlier date to ensure any information accessed is as up-to-date as possible, signalling far more robust data-driven outcomes for next year and beyond.

Key figure: 20-30 billion connected devices by 2020

Watch out for: More bad ideas, such as internet connected egg storage.

Droning On

With all new technology there are teething issues that must be overcome on the road to success. For drones, the issue of safety remains at the forefront of any debate. Amazon maintain that drone delivery will soon become a reality, claiming that it will soon be ‘as normal as seeing mail trucks on the road’. This opens up huge possibilities for the future, with faster delivery of key items driven by increasing online shopping habits.

For 2017, drones look to mature as a technology, with a large range already publicly available. The more efficient and advanced drones become, the more useful they will be to businesses and the general public alike. The main point is investment – with big-company backing, drones look to be the most hyped technology of the near future.

Key figure: Projected value of the drone industry by 2025: $90 billion.

Watch out for: Outlaw sausage deliveries.

Charging Forward

With over 80,000 electric cars already on UK roads today, 2017 looks set to build on this wallet-friendly method of travel. There are already new releases planned that seek to improve every aspect of travel and, with Tesla releasing more affordable vehicles, electric cars now look far more aspirational than their initial conceptions suggested.

2017 will see Tesla owners having to pay to charge their electric cars, having previously received this service for free. While this may come as a surprise to some, it does indicate a huge rise in demand. As charging points increase alongside electric car sales, 2017 will be seen as the turning point at which the industry went mainstream.

Key figure: Electric vehicles to represent 35% of new car sales by 2040

Watch out for: Need for speed – Acceleration madness

Talk it out

For 2017 and beyond, voice control looks to evolve past providing novelty pre-programmed responses. With the Amazon Echo and Google Home technologies looking to revolutionise hands-free interaction, big names are looking to advance the core technology implementations of ‘voice browsing’ far beyond current capabilities to provide context sensitive assistance applicable to real-world situations in the home and on smartwatches.

For the UK, the delayed release of the Amazon Echo means that 2017 will be the first full year that this technology will be used in everyday household situations, a stern test that will either impress or annoy – only time will tell.

Key figure: 30% of web browsing will be screenless by 2020.

Watch out for: frustrated users shouting at their wrist.

Virtual Reality

Gaming has come a long way since the early days of virtual reality, now providing positively received, immersive experiences on affordable hardware. With more headsets from more companies set for 2017, soon virtual reality will reach far beyond the scope of home entertainment.

The global market size of virtual reality is predicted to more than double in 2017, with online shopping just one of the many predicted experiences set for the future. Soon, leaving the house will be unnecessary to experience window shopping or ski slopes, with headsets that become more realistic and convincing with every new iteration.

Key figure: 58 million users of virtual reality in 2017.

Watch out for: surging number of neck braces

With such a varied range of new and evolving technologies, 2017 looks to innovate where 2016 frustrated. While electric cars will pave the way towards more environmentally friendly methods of travel, it may be virtual reality that stops users leaving home at all. After all, why leave the house when you can order food using voice control and receive drone-delivered packages?


I write this having wrapped up another year of discussion at our stand at EHI Live 2016. All of the planning and hard work that went into this year’s exhibition certainly feels like it has been worth it now. It’s safe to say it has been another successful year for us at EHI Live, perhaps made more monumental than the last by the introduction of our new exhibition stand.

Thank you to all of the visitors who came by our stand for a chat – it was great to hear your thoughts on tackling the challenges your organisations face. You certainly kept us all very busy!

With another year gone, it feels like an appropriate time to discuss the key takeaways from this year’s EHI Live. These takeaways have been driven in part by your discussions, as we weren’t able to visit many of the talks.

Data Governance

Data governance is always a hot topic for discussion at EHI Live and this year proved no different. One of the main objectives for healthcare organisations right now is joining together all of their disparate data into one repository. This brings with it all sorts of data governance challenges.

Unifying all siloed data within an organisation significantly increases the number of people with an interest in accessing it. Data governance controls would have to be much stricter than perhaps previously, ensuring that our confidential personal data is only accessible by those who truly need it to improve our health outcomes.

Interestingly, the end of last year saw the Secretary of State for Health commission the National Data Guardian (NDG) to carry out an intensive review of data governance and recommend new data security standards. In her foreword, the NDG, Dame Fiona Caldicott, explains that there has been very little positive change in the use of data across health and social care since the 2013 review. The 2016 review identified that many of the data governance breaches related to information held on paper. As the health and social care sector moves towards a paperless digital future with the right technologies, better data governance protocols should follow naturally.

Interoperability

Interoperability was something we mentioned in our takeaways from last year’s event and this year we found it was no less important! In fact maybe more so, as from the conversations we had with people, this was certainly a key area of interest for them.

Since last year, we have been accredited under techUK’s Interoperability Charter to show that our solutions are interoperable and that we too are invested in this area, ultimately to deliver better integrated health and care.

In an ideal world, all healthcare systems would be interoperable so that they are able to work together. This provides numerous benefits, the most prominent being that data can be passed between different systems, allowing a joined-up view of organisational data that can be used to better track the patient journey.

This really needs to be made a key focus by healthcare technology suppliers, to ensure the products they provide to the health and care sectors are interoperable. All systems within healthcare should be part of an integrated community of systems, and it is time we made this a reality.

Shared Digital Vision

This is something that has always stood out to me as someone who is enthusiastic about the great benefits achievable through self-service analytics and a data strategy that allows input from clinicians and other end users. It was certainly something I picked up on as being a key topic of this year’s event.

We have already seen that the current approach is not working: many healthcare professionals are still left isolated from data because of low technology adoption rates, perhaps caused by poor technology usability and a failure to communicate the vision to front-line staff.

I felt this was summed up really well in one of the talks when it was said, “We want technology to work around us, not the other way around.” That short sentence describes exactly what all technology providers should be striving towards, healthcare or not: technology designed with end users in mind to make their jobs easier.

Without involving front-line staff and other end users in the digital vision, healthcare organisations are missing a critical opportunity to build a data-driven culture and design their technology requirements around the teams of end users who will be using it the most.

Big Data

Big data was always going to be a big topic this year. I suppose this is more prevalent within health and social care simply because they are a perfect example of big data generation and the challenges that come with storing, retrieving and analysing information.

Everyone knows the benefits of successfully implementing a big data strategy, but the unique challenges of the NHS make it very difficult to do so. As one of the world’s largest organisations, it is difficult to fathom all the different systems it uses, all generating data of varying size and format.

With the introduction of IoT and other wearable devices in healthcare, this big data generation is only set to increase the challenge of successfully capturing and analysing big data. In order to realise a world of predictive analytics, trend forecasting, disease prevention, pre-emptive prescribing and so on, the insight held within big data must be harnessed.

The key areas for discussion around solving the NHS’ big data challenges revolved around a lack of standardisation of NHS software, the difficulty in maintaining data governance standards and a shortage of required skills needed by staff.

It seems essential that the NHS and other health and social care organisations really research the market to ensure they are getting a solution that ticks all of these boxes. As a final note, the role of self-service technology is becoming more and more important, allowing clinicians and other non-technical end users access to applications that require little training, meaning they are able to concentrate on improving their service – delivering high-quality care to the patient.


Now that technology has infiltrated our everyday lives and smartphones provide us with an ever-increasing array of information, the internet has become a far more accessible commodity. While it has grown enormously over the past decade, the data is actually easier to navigate than ever before, with seemingly limitless content rapidly available to be digested worldwide in an instant.

This data growth is not limited to consumable online content. A shift in attitudes and advances in data capture technology have resulted in businesses possessing more data than they are currently able to gain meaningful insight from – the ‘big data’ challenge.

While the widespread availability of robust storage technology aids consumers, it has created a real issue for businesses: all of the data is stored, but its variety, location and immense size leave potential insight untapped. Data, after all, is just data without effective analysis.

The data warehousing concept, initially crafted to draw structured transactional data from a range of sources, has transformed the availability of these enormous, separated datasets for businesses, alleviating access issues that slow reporting and subsequent decision making.

As the influx of unstructured data continues to shape the data input, the data warehouse model must adapt to not only offer storage, but analysis of the information-rich assets that, through traditional analysis, remain unavailable for cross-examination alongside structured data.

Before any effective analysis takes place, however, data must first be properly loaded into the warehouse with robust, properly planned rules applied. “You are what you eat” is a mantra all data warehouses should live by, as poor data quality slows reporting and clouds the view of decision makers, negating its primary purpose.

As data is fed into a warehouse, it must first be ‘cleansed’. By allowing this level of control, cleansing rules can automatically flag data that is incorrect, improperly formatted or that does not meet strict field-specific criteria. ‘Bad’ data can therefore be avoided and subsequent reporting can then be based on good data confidence.
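
As a rough illustration of how field-specific cleansing rules might be expressed, here is a hypothetical sketch in Python (the column names and formats are invented for the example):

```python
# Hypothetical field-specific cleansing rules: each returns True when a value is acceptable.
import re
import pandas as pd

RULES = {
    "email":     lambda v: isinstance(v, str) and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "join_date": lambda v: isinstance(v, str) and re.fullmatch(r"\d{4}-\d{2}-\d{2}", v) is not None,
    "age":       lambda v: isinstance(v, (int, float)) and 0 <= v <= 120,
}

def flag_bad_records(df: pd.DataFrame) -> pd.DataFrame:
    """Return the rows that break at least one cleansing rule so they can be quarantined."""
    failing = pd.Series(False, index=df.index)
    for column, rule in RULES.items():
        if column in df.columns:
            failing |= ~df[column].map(rule).astype(bool)
    return df[failing]
```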

This level of analysis requires honing the input data through robust data quality rules. Due to data warehouses pulling information from a variety of sources, differing input methodology usually results in data sets that do not match, with little differences such as the spelling of names (‘John’ or ‘Jon’) effectively duplicating individuals. By having set data quality rules, a report can be generated that flags potential inconsistencies while future algorithms are refined to proactively correct issues as they arise.
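
A minimal sketch of such a fuzzy-matching data quality rule, using Python’s standard difflib – the similarity threshold here is an illustrative assumption rather than a recommendation:

```python
# Flag pairs of names similar enough to be the same person entered twice ('John' vs 'Jon').
from difflib import SequenceMatcher

def likely_duplicates(names, threshold=0.8):
    flagged = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
                flagged.append((a, b))
    return flagged

print(likely_duplicates(["John Smith", "Jon Smith", "Jane Doe"]))
# [('John Smith', 'Jon Smith')]
```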

Finally, data validation rules can be used to flag and track any records that do not meet set criteria. By making this data available for reporting, the number of records that do not adhere to set standards can be brought to attention, pinpointing exactly which data source requires further planning to match the rest of the imported data.
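
And a hypothetical sketch of a validation report along these lines – the source_system column and the criteria themselves are invented for illustration:

```python
# Hypothetical validation report: how many records per source fail each criterion.
import pandas as pd

def validation_report(df: pd.DataFrame, criteria: dict) -> pd.DataFrame:
    report = {}
    for name, passes in criteria.items():
        fails = ~df.apply(passes, axis=1).astype(bool)
        report[name] = df[fails].groupby("source_system").size()   # "source_system" is assumed
    return pd.DataFrame(report).fillna(0).astype(int)

criteria = {
    "postcode_present":   lambda row: bool(str(row.get("postcode", "")).strip()),
    "value_non_negative": lambda row: row.get("order_value", 0) >= 0,
}
```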

Traditional data warehouses have rigid schemas that struggle to keep up with the constant increase in data and the emergence of new unstructured data sources. To combat this, newer technologies encourage proactive development of context-specific rules that greatly increase data quality whilst allowing access to the increasing number of datasets required to make proactive informed decisions.


In recent years, banks and building societies within the financial sector have had to respond to the rapidly changing landscape of the financial world. As customers drive change from traditional high street branches to a more modern digital service, it is imperative that these organisations do not get left behind.

The number of building societies in the UK has dwindled over the last two decades, dropping from 80 in 1995 to 45 in 2013. Meanwhile, the number of branches has reduced significantly over the same period, going from 5,141 to 1,604, illustrating the massive shift in consumer demand towards a digital service.

The challenge facing organisations within the financial sector is dealing with the huge volume, variety and velocity of data, known as ‘big data’, as their services shift towards the realm of digital.

Influx of Big Data

Historically, banks and building societies have been able to serve their data and reporting needs using basic spreadsheets as a means to get by. With the influx of ‘big data’, this outdated approach is now just not enough and many organisations who have been slow to invest in new digital technologies are in danger of becoming reactive rather than proactive to movements in the market and customer needs.

Furthermore, as reporting takes place in spreadsheets via manual extracts from operational systems, there is a challenge to centralise the data – to obtain a single version of the truth. If these organisations are to truly be successful, they must look at centralising their data silos. As Gartner puts it, ‘think global, but act local’.

This mantra means that the information in the organisation must be centralised and available at a global level, which provides the organisation with the ability to make decisions confidently and more accurately on an individual local level.

Overhauling the digital infrastructure

With the modern digital needs of banks and building societies it is vital that there is an overhaul of digital infrastructure to support big data analytics and establish a data-driven culture. Before any sort of business intelligence implementation takes place, banks and building societies must evaluate their own individual requirements and circumstances to ensure the successful implementation of any solution.

Manual processing and reporting via spreadsheets is an issue reaching far and wide, expending valuable resources that would otherwise be allocated elsewhere. A shift towards business intelligence solutions that are self-service with fast, efficient access would provide a significant return on investment for those relying on legacy systems.

Building societies may also want to look at other benefits of self-service business intelligence solutions that go beyond cost-savings such as fraud analysis. While the cost of fraud prevention systems is typically quite high with ‘traditional’ solutions, self-service analytics allows core functionality to be expanded without the large costs usually associated with this level of data insight.

For banks and building societies to keep up with the fast pace of the changing financial market and the ever-growing quantities of data in the world, they must centralise and then harness the available data to tackle fraud, meet customer demands and make inroads into new markets. Without this, the downward trend in the number of banks and building societies operating in the UK may well continue.


In recent years, fraud has become a recurring problem for local authorities in the UK. Back in 2013, it was found that as many as one fifth of London council tenancies showed ‘indications of fraud’, and the Annual Fraud Indicator (AFI) concluded that a figure of £52 billion had been taken from the economy due to fraudulent activity.

While it is relatively easy to identify the high incidence rates, it has been significantly more challenging to identify the type of fraud committed. Every year the UK government estimates the percentages of services and benefits that are taken fraudulently, but these are only approximations. In reality, the figures may be much higher.

There is, however, a clear reason why so many instances go undetected. Due to the lack of resources, there is simply not the infrastructure to provide extensive analysis that would uncover such occurrences. It is this analysis that is necessary to discover the inconsistencies, with housing benefit fraud, for example, often discovered by cross-referencing service bills, such as banking or utilities, with housing records. When inconsistencies are present, it can often be the first indicator of fraudulent cases.
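
Conceptually the cross-reference itself is simple, as the hypothetical sketch below shows (the datasets and column names are invented); the real difficulty, described next, is doing this at scale across siloed systems:

```python
# Hypothetical extracts: the address on a housing-benefit claim vs. the address a utility provider bills.
import pandas as pd

housing = pd.DataFrame({
    "claimant_id":   [1, 2, 3],
    "claim_address": ["12 High St", "3 Oak Rd", "7 Mill Ln"],
})
utilities = pd.DataFrame({
    "claimant_id":     [1, 2, 3],
    "billing_address": ["12 High St", "9 Elm Ave", "7 Mill Ln"],
})

merged = housing.merge(utilities, on="claimant_id", how="left")
inconsistent = merged[merged["claim_address"] != merged["billing_address"]]
print(inconsistent)   # claimant 2 is billed at a different address - a first indicator worth reviewing
```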

While accumulating this data poses no difficulty, it is cross-referencing the separate data sources that poses a problem. Traditionally, local authorities store the collected information in rudimentary databases, with some still opting to use Excel spreadsheets. This makes the process of extracting key data very time consuming, which is further aggravated by the high volumes of data being processed.

Opting for traditional solutions to make sense of the data means relying on trained analysts, or the council spending a large amount of money training certain members of staff, either of which can result in very high operating costs. As local councils must operate cost-effectively, neither option is viable for regular analysis.

To combat this, there needs to be a technological shift to democratising business intelligence. This requires an understandable means of interfacing with the masses of data, allowing more members of staff within an organisation to gain actionable insight from self-service business intelligence, negating the long waiting times and overreliance on IT departments usually associated with traditional solutions.

One example is search-powered analytics, where data is available via natural language search rather than complex programming languages. By utilising this technology, anyone familiar with an internet search engine can navigate key data and can unify previously siloed datasets into a single interface, providing key decision makers with a single version of the truth.
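
The underlying idea can be sketched with a toy inverted index – this is purely illustrative and not a description of any particular product:

```python
# Toy example: records pulled from previously siloed sources into one searchable collection.
from collections import defaultdict

records = [
    {"id": 1, "source": "housing",  "text": "council tenancy 12 High St arrears"},
    {"id": 2, "source": "benefits", "text": "housing benefit claim 12 High St"},
    {"id": 3, "source": "parking",  "text": "permit application 9 Elm Ave"},
]

index = defaultdict(set)
for record in records:
    for word in record["text"].lower().split():
        index[word].add(record["id"])

def search(query: str):
    """Return the records matching every word of a plain-English query."""
    words = query.lower().split()
    hits = set.intersection(*(index[w] for w in words)) if words else set()
    return [r for r in records if r["id"] in hits]

print(search("12 High St"))   # matches records held in both the housing and benefits silos
```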

This approach allows staff from all areas of an organisation to contribute to a data-driven culture, where decision making is based on sound data confidence. Where local authorities are concerned, this kind of insight could prove vital to recognising trends and making valid adjustments to services to reduce instances of fraud while making substantial cost savings.


As we start moving into the latter stages of the year, I thought it was about time we revisited a topic that we covered last year in our blog – the NHS winter pressures. A lot has happened in the NHS over the last year: the four-hour A&E waiting time target has been missed, key cancer targets for referral to treatment (RTT) have not been hit for over a year and the NHS 111 phone service is missing its target to answer 95% of calls within 60 seconds. The constant, however, is the winter pressures faced by the NHS every year.

The NHS faces more pressure over winter than in the other seasons because the cold and wet weather increases the likelihood of injuries and illnesses among patients. Among the particularly vulnerable are the old and frail, who need more care and attention. This can become an issue for bed occupancy rates across the NHS, as hospitals are unable to discharge patients who need support from community healthcare teams, even though they are medically fit to leave.

Challenges usually confined to the winter period are now being experienced over the course of the year. The four-hour A&E target set by the NHS hasn’t been hit nationally for over two years, with admission rates rising at a consistent rate year on year. It is predicted that by 2022, 17,000 further beds will be needed to cope with the demand.

Interestingly, a research report conducted by QualityWatch found that over a quarter of A&E units missed the four hour A&E target despite being less busy than expected. Additionally, there is a decline in the number of A&E attendances over the winter period.

So what is causing these units to miss their targets? Over winter, the proportion of older people admitted to A&E was found to increase, and typically they wait longer in A&E than other, younger patients. The older generation of patients typically have more complex and urgent care needs, and thus discharging these patients takes a lot longer than it does for others.

The findings in this report illustrate the need for heightened community care for the elderly. This would ensure that they receive a high-level of care outside of hospital, thus freeing up beds for other more critical patients and reducing the strain on A&E departments, particularly during the winter months.

Using data to solve the challenges

Paramount to the success of this approach is the use of data. The NHS has a wealth of data available to it, generated by various departments and areas within the NHS, but much of this data is left siloed. The challenge facing the NHS over recent years has been integrating this data to gain important insight, so that decision makers are better prepared to make more informed decisions.

By bringing together data across the different areas of the NHS, including primary, community and secondary care, a better picture of the patient journey can be formed. Understanding patients’ full medical records is critical and will enable the analysis of their historical medical data in order to identify patterns leading to hospital admissions.

This information can be used in conjunction with predictive analytics to target those patients most at risk of admission, and countermeasures can be taken to deliver a proactive care plan to these individuals, particularly the elderly.
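
As an illustrative sketch of how such risk targeting might be prototyped – the extracts, column names and model choice are assumptions, not a description of any NHS system:

```python
# Hypothetical sketch: score patients by likelihood of admission within the next year.
import pandas as pd
from sklearn.linear_model import LogisticRegression

history = pd.read_csv("patient_history.csv")      # assumed joined primary/community/secondary care extract
features = ["age", "prior_admissions", "long_term_conditions", "lives_alone"]

model = LogisticRegression(max_iter=1000)
model.fit(history[features], history["admitted_within_12_months"])

current = pd.read_csv("current_caseload.csv")     # assumed extract of the current caseload
current["admission_risk"] = model.predict_proba(current[features])[:, 1]
at_risk = current.sort_values("admission_risk", ascending=False).head(50)   # candidates for proactive care plans
```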

As technology advances have been made within business intelligence and big data analytics, it seems as though the NHS has not yet caught up and still uses outdated technologies incapable of solving the challenges faced by modern healthcare institutions.

The power of search

We recommend that the NHS leverages search-powered analytics to solve this challenge, allowing access to up-to-date information from all sources in a single user interface without the technical restrictions traditionally associated with legacy technology. With search technology, the NHS can quickly collate their data feeds into speed-optimised, searchable indexes to help measure and understand risk factors leading to admitted patients.

While it is quite apparent that it is difficult to predict A&E admissions over the winter months, it is clear that action can be taken, provided the right technologies are employed with the right vision and mentality. Substantial efficiencies, and decisions leading to a resolution of the challenges faced by the NHS, can be made through the implementation of search-powered analytics.

For more information or to arrange a personal demonstration of CXAIR please contact info@connexica.com


Imagine this scenario: you wake up one morning and brew yourself a fresh, pod-based coffee. While basking in the decadence of your velvety beverage you glance back into the cupboard and a harsh reality dawns: you are running low on pods. This is no ordinary coffee. You cannot simply leave the house to restock the cupboards with copious amounts of finely ground Arabica produce as this particular blend is an online exclusive. How can this situation be quickly rectified and feng shui restored?

The answer, of course, is to embrace the Internet of Things (IoT).

Encompassing the latest IoT innovation, Amazon has rolled out its new ordering service, Amazon Dash, to the UK. With the press of a button, items such as toilet roll and coffee pods are electronically ordered and delivered within 24 hours, allowing users to replenish everyday items without leaving the house or visiting a website.

How successful this implementation of absurd convenience will be remains to be seen. While the ironic inconvenience of queuing at a local delivery office due to missing a delivery of dishwasher tablets may prove too much for some, few will doubt Amazon’s ability to influence online shopping trends.

This new IoT device represents a major shift in the way in which devices around the home are connected. When Amazon first launched in 1995, it claimed to be ‘Earth’s biggest bookstore’, now it is leading the revolution against traditional shopping habits and championing ultra-convenient order processes powered by a huge logistical network – all made possible by the IoT.

This is just one instance. Light bulbs, house alarms, speakers, surveillance cameras and central heating monitors are just some examples of the electronic devices given internet access and subsequent remote control. Has the IoT gone OTT?

Some think so. With its relatively fast evolution, many IoT devices are simply not ready for the security risks that the internet poses. The Director at Corero Network Security, Sean Newman, noted how many IoT devices have barely enough processing power to connect and deliver core functionality, with security not considered a relevant factor. This fear was realised when US retailer Target was hacked in 2013 through the vulnerabilities of an internet-enabled climate-control system.

The real threat that hackers pose to IoT devices was highlighted in an experiment where a car was remotely controlled, with the exploit allowing remote access to in-car functionality while it was being driven. In the experiment, the car was being driven at 70 mph before the screen wash was activated, the radio turned up to maximum volume and the accelerator pedal disabled – all possible over a remote internet connection. With certain manufacturers opting for masses of features without considering the sizeable security risks, thousands of cars have been shipped with entertainment systems open to remote hacking that could prove fatal.

As the real selling point of the IoT is its widespread integration with a seemingly limitless device list, the risks become more widespread. Bruce Schneier, Chief Technology Officer at Resilient, makes a point that ‘The next president will probably be forced to deal with a large-scale internet disaster that kills multiple people’, conveying the real fears that go hand-in-hand with an increasingly connected, and therefore vulnerable, everyday environment.

When the famously misquoted, likely satirical ‘everything that can be invented has been invented’ was coined, its creator was blissfully unaware of a future where convenient coffee machines and ‘smarter’ appliances would be sought-after devices. For the IoT revolution to maintain its current trajectory, however, its wild ambition must be matched by security that keeps pace with the features on offer.


The traditional challenge facing fast-growing organisations and large enterprises is being able to manage all of their business processes and an ever-growing number of operational systems. This is when an Enterprise Resource Planning (ERP) system such as Microsoft Dynamics NAV is implemented, to integrate these systems and lead to a more intelligent, streamlined and automated approach to business processes.

In today’s competitive business world it is important to strive to achieve every competitive advantage possible. A great ERP system on its own isn’t enough – more advanced analysis and actionable information is required to support top-level management decisions.

ERP is not enough on its own

While most modern ERP systems contain a reporting module, it’s never as comprehensive as a full business intelligence application. For example, reporting within an ERP solution is useful for providing a visual snapshot of where you are at a certain point in time, but a more comprehensive solution is needed when you want to delve into the real-time information available to your organisation or when you want to isolate contributing factors. For a retail organisation this could be a visual representation of all of the products in their various shops as, at a glance, you can easily identify top performers and the worst performers.

That’s good information, but now what? You need to dig deeper and ask questions of it in order to understand what contributing factors are causing certain products to be top performers and others to be low performers. Is it simply a matter of customer preference or could something else be contributing to performance?

Integrating data from your ERP system and other operational systems with a business intelligence application gives you full visibility of the organisation and puts you in the driving seat when making decisions.

Drilling down into the data further might reveal that there have been several issues with the quality of a supplier’s products, which has in turn impacted your performance on the shop floor. You may decide to cut your losses with this supplier and look elsewhere for an alternative, or stick with your current supplier. The point is, you have the actionable information you need to make a decision.
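
As a rough sketch of this kind of drill-down, assuming hypothetical ERP and supplier-quality extracts:

```python
# Hypothetical extracts: sales by product/shop/supplier from the ERP, and defect rates per supplier.
import pandas as pd

sales = pd.read_csv("sales.csv")       # columns assumed: product, shop, supplier, units_sold
quality = pd.read_csv("quality.csv")   # columns assumed: supplier, defect_rate

# Step 1: the at-a-glance view - best and worst performing products
by_product = sales.groupby("product")["units_sold"].sum().sort_values()
print(by_product.head(5))              # worst performers
print(by_product.tail(5))              # top performers

# Step 2: drill down - do the worst performers share suppliers with high defect rates?
worst = by_product.head(5).index
drill = (sales[sales["product"].isin(worst)]
         .merge(quality, on="supplier")
         .groupby("supplier")["defect_rate"].mean()
         .sort_values(ascending=False))
print(drill)
```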

These are the sorts of detailed questions you need to ask your data in order to create useful analysis. It is possible to get these answers without integrated business intelligence but the process required is much more intensive and thus value is being lost.

For example, if I wanted to find these answers I would have to drop out of the transactional system and rely on a technically trained analyst to build and run a report for me using an OLAP-based system and provide me with the information that I need. A truly complete business intelligence and ERP integration provides an environment tying together business process management and advanced ad-hoc analysis, meaning you are able to ask sophisticated questions of your data faster and without reliance on others – that is true business value.

Scope KPIs and desired outcomes for success

Of course any project like this comes with its risks and challenges despite its massive potential. Paramount to the success of this approach is to ensure that the overarching strategy is well planned out in advance.

KPIs and desired outputs need to be scoped well in advance of beginning the project, and maintaining flexible requirements is essential to ensure the strategy can adapt over time. The planning needs to be well communicated so that all involved employees have a very clear idea of how they contribute to the end goal – I would advise reading our previous post on building a data-driven culture for more information.

A typical cause of failure in business intelligence projects is the lack of a clear vision of the outcome that the strategy will provide – think ahead about the objectives you are trying to accomplish for a positive end result.

There’s no doubt that with the right strategy, the right technologies and the right mentality, a business intelligence and ERP integration can be a source of huge competitive advantage for businesses – the key is to reveal actionable information with the right level of analysis.


With numerous operational systems and data input procedures that vary depending on the individual and their department, uncovering causal links between patients and operating theatre efficiency in the NHS is not a straightforward task.

Theatre time is the most costly activity that a hospital delivers. If cancellations can be reduced, huge cost savings can be achieved. However, the latest NHS statistics reveal over 2,700 more last-minute theatre cancellations for non-clinical reasons than in the same period for 2014/15, pointing to an increasing problem that must be urgently addressed.

One way to reduce theatre cancellations is to manage other areas of care that act as influencing factors. Unscheduled care, such as the largely unpredictable nature of arrivals in the Accident and Emergency (A&E) department, has an immediate knock-on effect on a number of important factors such as bed availability, waiting lists and theatre utilisation.

Such irregularities are notoriously difficult to plan for, especially given the national rise in A&E admissions since the start of the year. One solution is to use predictive analytics, resulting in far more effective warnings given to decision makers in advance. For example, certain distinct variables may, when occurring simultaneously, result in an increased number of admissions. This trend would be noted by the analytics system and subsequently flagged as a time where more proactive planning of resources can be achieved. By using quantitative data from previous real-life scenarios in the same hospital to predict increase in demand, analytics can be used to support effective planning with an aim to better manage the often problematic demands of unscheduled care.
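
A minimal, hypothetical sketch of this kind of demand forecasting – the input files, feature names and alert threshold are assumptions for illustration only:

```python
# Hypothetical sketch: learn from past daily admissions, then flag high-demand days in the week ahead.
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

history = pd.read_csv("daily_admissions.csv")            # assumed: one row per day
features = ["min_temperature", "is_weekend", "is_school_holiday", "flu_rate"]

model = GradientBoostingRegressor(random_state=42)
model.fit(history[features], history["ae_admissions"])

week_ahead = pd.read_csv("week_ahead_conditions.csv")    # assumed forecast inputs for the coming days
week_ahead["expected_admissions"] = model.predict(week_ahead[features])

threshold = history["ae_admissions"].quantile(0.9)       # illustrative planning threshold
alerts = week_ahead[week_ahead["expected_admissions"] > threshold]
print(alerts)   # days where more proactive planning of beds and theatre lists may be needed
```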

Why is this important?

Without effective planning, unscheduled care can have a profound effect on theatre utilisation. Reduced availability of beds and longer waiting lists results in two distinct outcomes: an extended wait for the patient and missed targets for the Trust.

For the patient, the inconvenience of reorganising all of their travel, work and child-care arrangements is added to the increased emotional stress for a patient who may further deteriorate in the time it takes for the operation to take place.

For the Trust, there are immediate financial implications. The average tariff per case is £1,500, with orthopaedic operations costing between £4,500 and £9,000. Furthermore, should the patient not be re-booked within 28 days, the patient can choose any other hospital for the operation to be performed while the original hospital pays – a double blow for a Trust’s finances, as it pays for operations it is not carrying out. With the NHS under increased pressure to remain financially viable, these costs represent massive inefficiencies that, with the right planning, have the potential to be vastly reduced.

The impact of predictive analytics cannot be overstated. Even when applied to only one area of a Trust, such as the management of unscheduled care, bed management subsequently improves – and poor bed management is a factor that often leads to last-minute non-clinical cancellations.

The amount of data available to Trusts is as vast as it is varied. Used effectively alongside analytics, the resulting insight has the potential to inspire operational efficiencies that can make an enormous difference to both patients’ lives and Trusts’ bank balances.


Cuts in community and social care budgets are adversely affecting services provided by local care organisations. These were the findings from a King’s Fund NHS Monitoring report, which surveyed NHS Trust finance directors and Clinical Commissioning Group (CCG) finance leads.

The findings have been supported by statistics coming from NHS performance data, showing that over 5,000 patients had experienced delays in being discharged from hospital at the end of August 2015 – the highest figure for that time of year since 2007.

A more recent study on council social care services found that they will struggle to cope with a £1bn shortfall in social care funding this year.

With the government making their stance very clear with regards to NHS funding, it seems as though community care organisations will have to think quickly in order to adapt and improve efficiency and reduce unnecessary costs all whilst maintaining a high-level of service.

Is Analytics the Answer?

A good place to start is the information environment of a community care organisation. The life-span of technology is quite short compared with most other products, and the NHS has developed a reputation for being slow in the uptake of newer, more efficient technologies.

There is therefore a likelihood that some information systems are outdated and could be made more efficient, providing a cost saving. To find opportunities for improving efficiency, community care organisations must first identify which areas of their information environment are holding them back and whether they have resources elsewhere to re-assign.

In one example, it was found that savings could be made by relying less on external analysts to build and share reports across the organisation, and using in-house IT staff instead. An initial investment was needed to acquire and implement the solution, but over time it substantially reduced costs, as the reliance on external analysts was negated because the in-house IT team were comfortable creating the reports in the new solution themselves. This rang true for one community care organisation that was able to save £500,000 annually.

Flexible Frameworks and Real-Time Intelligence

Critical to achieving success in any analytics project is collaboration and agility between all departments and clinical staff involved in the reporting and information capture process. By ensuring consistent collaboration at a high-level, a more flexible framework can be established where challenges and other issues encountered can be quickly resolved by steering the project in a slightly different direction.

Real-time intelligence is one such example of the need for collaboration and agility. Often regarded as the Holy Grail of analytics, it means every analyst can access the data as soon as it has been captured. Previously we have been limited by the technology available, but there is now the capability to gain real-time insights from data. Unfortunately there are still many community care organisations that are yet to take advantage of this to transform their ability to find opportunities for improving internal efficiencies and service experience.

Supporting a real-time data intelligence implementation requires total integration of all the data stored by the community care organisation. This has always been difficult to deliver, as OLAP and in-memory technologies have proven awkward and time-consuming to work with when bringing in new data sources.

A new approach is needed that is more agile and quicker at integrating data from any source, providing a complete picture of organisational performance and efficiency in real time. The budget constraints forced upon community care organisations mean they now have to look to more modern, efficient technologies to cut unnecessary expenditure.
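
As a rough illustration of the difference – and not a description of any particular product – the Python sketch below updates a combined view record by record as new data arrives, rather than waiting for a batch rebuild of a cube. The source systems and field names are entirely hypothetical.

```python
from collections import defaultdict
from datetime import datetime

# One combined, always-current view, updated as each record arrives
# rather than rebuilt in a nightly batch.
combined_view = []
counts_by_service = defaultdict(int)

def ingest(record):
    """Add a single record from any source and update summary figures immediately."""
    record["ingested_at"] = datetime.utcnow().isoformat()
    combined_view.append(record)
    counts_by_service[record.get("service", "unknown")] += 1

# Hypothetical records arriving from different source systems.
ingest({"source": "community_visits", "service": "district nursing", "duration_mins": 45})
ingest({"source": "referrals", "service": "physiotherapy", "duration_mins": 30})

print(dict(counts_by_service))  # an up-to-date picture without waiting for a rebuild
```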


As the world grinds to a standstill with fully grown adults catching virtual monsters from a twenty-year-old Game Boy game, many missed a big piece of news – another technology has bid farewell and joined the dodo in the land of extinction. Last week, the world learned of the death of the VCR.

Of course, it may be more shocking to some that this technology was still being produced at all, given that the DVD technology which replaced it was itself superseded by Blu-ray over ten years ago. Now that 4K Blu-ray is set to become the standard, HD no longer cuts the mustard – it is all about UHD, or 4K, allowing consumers to re-purchase their films for a slight upgrade yet again.

For technology, there is no exact science that guarantees longevity. The renaissance of vinyl sales in the UK speaks volumes, with music on vinyl generating more income for UK artists than YouTube last year. While fully analogue, meticulously mastered recordings reproduced on high-end equipment have the potential to sound far better than a dynamically compressed digital release, many artists today record in fully digital studios, and findings point to a consumer base that does not play the albums it is buying. With consumers buying the same dynamically limited digital track found on a CD, pressed onto a vinyl record they will never play, the reasoning behind the purchase is questionable at best.

The point is that even when far more convenient, modern approaches exist, older technologies continue to co-exist despite the advances made. For some the reason is cost; for others the reasoning is as useless as the shrink-wrapped vinyl dying to be played.

For the analytical landscape, the decision to move on from a ‘traditional’ solution in order to leverage the data available to a business should be an easy one, as the growth of data shows no signs of slowing. This holds true for companies of all sizes, with a modern approach to data analytics offering a real competitive advantage.

However, findings continue to uncover a trend that undermines the opportunities afforded by the ‘big data’ revolution. One study found that while 75% of business leaders claimed to be making use of the data available to them, only 4% were ‘set up for success’ – highlighting the necessity of a well-founded business plan if the profound impact data analytics can provide is to be understood and exploited.

Challenging the traditional notions of BI is a must if companies are to not only understand the data available, but leverage it for decision making. Reports based on pre-aggregated, transactional data from limited sources will only ever present an incomplete, retrospective picture of the truth – why make decisions without all of the information?

This is where technology advancements provide a huge increase in functionality. Search-powered analytics provides not only a faster, more up-to-date view of the available data, but does so without limiting what is analysed or who can use it. Coupled with advances in user interaction, such as Natural Language Understanding (NLU), a previously daunting amount of data becomes both easy to navigate and manageable in all areas of a fully optimised workflow.
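
To make the idea of natural-language querying concrete, here is a deliberately tiny, rule-based sketch in Python that turns a plain-English question into a filter over some made-up records. Real NLU engines are far more capable; the field names and rules here are purely illustrative.

```python
import re

# Made-up records and a very small rule-based mapping from a natural-language
# question to a structured filter -- an illustration, not a real NLU engine.
records = [
    {"region": "North", "year": 2015, "sales": 120},
    {"region": "South", "year": 2016, "sales": 180},
    {"region": "North", "year": 2016, "sales": 150},
]

def total_sales(question):
    year_match = re.search(r"\b(20\d{2})\b", question)
    wanted_year = int(year_match.group(1)) if year_match else None
    wanted_region = next((r for r in ("North", "South") if r.lower() in question.lower()), None)

    hits = [rec for rec in records
            if (wanted_year is None or rec["year"] == wanted_year)
            and (wanted_region is None or rec["region"] == wanted_region)]
    return sum(rec["sales"] for rec in hits)

print(total_sales("total sales in the North during 2016"))  # 150
```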

So when evaluating a solution, ensure that it is up to the task of utilising the huge volumes of data that make up the modern analytical landscape, or risk getting left behind. The VCR held on for far longer than many thought possible, but that did not make it the most modern, viable medium in the long run.


Unless you have been living under a rock for the last few weeks, you will have heard of Pokémon GO – the augmented-reality mobile game that has swept the globe.

And while you will certainly have heard about it, odds are also in favour of you having at least downloaded and played it, as it has already overtaken leading mobile apps Twitter and Tinder for active users.

It would be fair to argue that the Pokémon brand has seen a huge resurgence, and much of this is down to the innovative way its developers have used big data as the main ingredient of a free-to-play massively multiplayer online mobile game.

Niantic’s innovative approach to utilising big data in Pokémon GO has created an environment where the user base is perfectly happy to contribute to the data capture and, perhaps just as importantly, it all happens in real time.

How it works

The game uses the player’s physical location, obtained via GPS, to generate a fantasy world full of Pokémon that the user can interact with and ultimately capture with Pokéballs.

One aspect that must be credited for the game’s success is the immersion it has brought. By utilising a well-thought-out data set, it has turned a typically uneventful stroll through the town centre into a hunt around hot-spots for catching Pokémon, evidenced by the ‘Pokémon GO meetups’ happening across the world.

At the centre of all this is the fundamental use of big data powering the game. However, it is not the first of its kind. Before Pokémon GO came another – Ingress, also developed by Niantic. Very similar to Pokémon GO, it encouraged users to visit real-life locations in a fantasy world, with the objective of capturing ‘portals’ for the user’s team. The locations of these ‘portals’ now serve as Pokéstops and Pokémon gyms in Pokémon GO.

Obviously these games are massively reliant on Google Maps, so it is even more interesting that Niantic is a spin-off from Google and that its founder, John Hanke, created what is now known as Google Earth and was formerly Vice President of Product Management for Google’s ‘Geo’ division.

With access to this geographical data, Niantic had the means to create Ingress, adding real-world landmarks across the globe as interactive portals within the game. Subsequently it opened this up to the user community, who contributed no fewer than 15 million further real-life locations to add to the game as portals. To date, Niantic has added 5 million of these.

Turning Big Data into a Game

The clever aspect of how Niantic has utilised big data in Pokémon GO is the algorithm it has designed to spawn Pokémon. To maximise immersion, it appears to spawn Pokémon according to a number of conditions such as the user’s area, local climate, time of day and other unknown factors.

By combining this geo-tagging (water, grassland, urban areas and so on) with climate and time data, the whole augmented-reality experience becomes more immersive. It makes sense to see water Pokémon spawn next to sources of water, grass Pokémon in large parks, and even psychic and ghost Pokémon near graveyards at night.
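
Niantic has not published its spawn algorithm, but a toy sketch of rules along these lines might look something like the following – the terrain categories, times and Pokémon pools are all invented for illustration.

```python
import random

# Purely illustrative spawn rules -- Niantic's real algorithm is not public.
SPAWN_TABLE = {
    ("water", "day"):   ["Magikarp", "Psyduck"],
    ("grass", "day"):   ["Oddish", "Bellsprout"],
    ("urban", "day"):   ["Pidgey", "Rattata"],
    ("urban", "night"): ["Gastly", "Abra"],
}

def spawn(terrain, hour, temperature_c):
    """Pick a spawn based on geo-tagged terrain, time of day and (crudely) climate."""
    time_of_day = "day" if 6 <= hour < 20 else "night"
    candidates = SPAWN_TABLE.get((terrain, time_of_day), ["Pidgey"])
    if terrain == "water" and temperature_c < 5:
        candidates = candidates + ["Seel"]  # colder climates tilt the pool
    return random.choice(candidates)

print(spawn("urban", hour=23, temperature_c=12))  # e.g. 'Gastly'
```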

To work, the game requires the user’s GPS to be turned on and the app to be running at all times. The potential for data capture here is enormous. Niantic is able to track exactly where we go with the app open and has the potential to uncover behavioural insights, such as the areas we like to frequent during the working week and at weekends – the sort of data that would be the Holy Grail for marketers. There is no doubt that this kind of behavioural insight would be hugely beneficial to companies in other industries.

The personal benefits users receive perhaps serve as extra motivation for continued participation. As well as being a fun game, delivering nostalgia by the bucket-load, a US study of Pokémon GO players reveals that 43% of participants lost weight whilst playing and that, on average, players are spending two more hours per day outside than before.

What Does All of this Mean for the Future of Big Data?

The issue that many have with big data is its massive scale. Extracting the useful data for analysis still proves a challenge, and any insight acquired is often rendered obsolete because the shelf life of useful data is forever shrinking.

With Pokémon GO, the big data landscape has certainly changed. Much big data is historical, and by the time it is analysed it has lost its relevance and much of its usefulness. The emergence of Pokémon GO as an example of real-time big data capture could prove to be an essential use case for the future.

It remains to be seen how this data will be used to augment the marketing and sales strategies targeting Niantic’s customer base, and whether the data will be shared with partner organisations, but it may well open the door to projects of a similar kind.


Described by the NHS in 2013 as an initiative that “will bring together health and social care information from different settings in order to see what’s working really well in the NHS and what could be done better”, the ‘care.data’ programme sounded like a genuinely positive, innovative approach for a healthcare service now under intense financial scrutiny.

The decision to leverage the extensive amount of data available to healthcare professionals would not only benefit staff, but help patients access a centralised database of their own records instead of the disparate records currently held across all areas of the health service.

Instead, the care.data programme has been scrapped. The NHS will not be taking advantage of an opportunity to increase efficiency across the entire organisation, and recent findings point to a government unwilling to invest in the long-term operation of the service, with research suggesting that the NHS continues to be consciously underfunded and that spending is not meeting the requirements of the Five Year Forward View.

So what is next for the NHS?

Theresa May’s cabinet shake-up saw Jeremy Hunt survive as the Health Secretary, a move which would be considered good for continuity if he had decided to fight alongside the junior doctors rather than against them. For the NHS moving forward, this can be seen as a call for more of the same – it is unlikely that a new, transparent system will ever materialise without the proper funding.

There is no doubt that efficiencies must be made, but the fact remains that the care.data programme was not as black and white as it first seemed, with many experts and healthcare professionals holding well-founded doubts over certain aspects of the service – aspects that must be reviewed to avoid the same mistakes in the future.

It was not just the public who were concerned: the chairwoman of Healthwatch Cornwall, Jayne Howard, claimed that “Research is the backbone of our work so we fully understand the importance of information gathering, but people need to understand fully what they are agreeing to and how information will be used”, raising valid questions about the use of personal, public data.

This concern is especially important when considering the 2012 Health and Social Care Act, which legalised data sharing without a patient’s consent. For some, the care.data programme was a step too far, with around 80% of respondents to one survey expressing their determination to opt out of the programme.

The major concern was data privacy, with individuals worried about who would be able to access the data and whether personal information would be sold to third parties. Even after the announcement, it was found that the Department of Health was slow to shut down care.data altogether, with data still available externally.

The shortcomings of the programme must be addressed and fears put to rest if there is ever to be another iteration. Transparency to the general public and healthcare professionals is of paramount importance to ensure the same issues do not recur.

The story of care.data will not be looked upon lightly and will remain a case of ‘what might have been’. While the business sector adopts a culture of data-driven decisions through self-service analytics to drive operational efficiency, the NHS remains an organisation in need of efficiency savings. Its data will have to be utilised eventually if the NHS is to make cost savings, but 2016 will not be the year it is put into effect.


The widespread availability of analytics has thrust us into a period where we are able to assess past performance and make informed decisions on how to move into the future. The growth of self-service has only served to open this up to everyone – including non-technical users.

However, the quality of the data fed into our systems can sometimes be questionable, and the outcome is that it skews our decision-making one way or another – sometimes for the better, sometimes for the worse – but either way the decision is based on inaccurate data.

The objective now is to improve the quality of the data we feed into our analytics systems, striving to create the strongest decision-support platform available and thereby improving ROI.

We can exploit ‘data augmentation’ to do exactly this. By combining the data we already own with data from sources both internal and external to the organisation, we augment our existing data, improving its accuracy and validity through the additional relevant information added. For a comprehensive definition of data augmentation, Techopedia has a very well-written one.
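
As a simple illustration of the idea, the sketch below joins a hypothetical internal table of orders to an external demographic reference table using pandas, so that reports can draw on the extra variables.

```python
import pandas as pd

# Internal data the organisation already holds (hypothetical).
orders = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "order_value": [120.0, 75.5, 210.0],
    "postcode":    ["ST16", "ST17", "ST16"],
})

# External reference data used to augment it (also hypothetical).
demographics = pd.DataFrame({
    "postcode":      ["ST16", "ST17"],
    "median_income": [27000, 31000],
    "population":    [12000, 9000],
})

# The augmented set carries extra variables for richer reporting.
augmented = orders.merge(demographics, on="postcode", how="left")
print(augmented)
```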

Quantity Sometimes Equals Quality

We often hear the phrase ‘quality over quantity’, but in this instance we can rephrase it as ‘quantity equals quality’. That is the mantra of data augmentation.

Supplementing the original data stored within your organisation’s data warehouse with additional data enables more in-depth reporting. Unifying disparate data sources, both across the organisation and externally, adds extra variables to the reporting process and provides clarity.

Obviously, the issue that commonly arises with adding more data is added complexity, especially when applying data governance. Take big data as an example: with the explosion of data seen in recent years, thanks to data-sharing applications such as social media, we face a data landscape more complex than ever before.

The range of data sources available to us has vastly increased, even compared with just a few years ago, resulting in a variety of formats spanning structured, semi-structured and unstructured data.

The Big Data Challenge

As big data, and the processes around capturing and analysing it, are still in an immature phase, data governance remains difficult to apply with maximum effect. The large quantities of unstructured data that organisations have never before utilised are now seen as the key to unlocking truly meaningful business insights.

It is currently estimated that around 90% of the available data in the world is unstructured, leaving a huge opportunity for organisations to exploit. Rick Sherman, founder of Athena IT Solutions, a consultancy in Stow, Massachusetts, warns that “Trying to manage or control everything in unstructured data would be a big mistake”. The view here is that a good portion of this data will be worthless, so the optimum analytical strategies will focus on separating the ‘noisy’, low-quality data from the good.
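
As a crude illustration of that separation, the snippet below applies a simple word-count heuristic to drop obviously ‘noisy’ free-text records before analysis. Real pipelines would use far richer quality checks, and the comments shown are made up.

```python
# A rough heuristic for separating 'noisy' free-text records from usable ones;
# real pipelines would add language detection, deduplication and more.
comments = [
    "Waiting times at the clinic have improved noticeably since March",
    "asdf!!!",
    "",
    "Staff were helpful but the discharge letter arrived two weeks late",
]

def is_usable(text, min_words=4):
    words = [w for w in text.split() if w.isalpha()]
    return len(words) >= min_words

usable = [c for c in comments if is_usable(c)]
print(usable)  # keeps the two substantive comments, drops the noise
```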

Augmenting the foundation of data already available in your organisation with this additional insight, and combining it with a data-driven culture, will likely prove to be the key to driving a successful analytics strategy and to separating high performers from low performers in the new digital age.


We are now living in a time of great uncertainty. Previously inconceivable ideas have come to fruition, leaving the general public with an unclear future for generations to come.

That’s right – Ikea will soon be selling flat-pack bikes.

If self-driving cars on the road sounded dangerous, imagine the horrifying spectacle of Aunt Mabel straddling a homemade rattling, rickety, Allen-key inspired two-wheeler down the high street.

Perhaps this is a pointless product, a leap into the world of unnecessary DIY that is one flat-pack too far. On the other hand, this is a marvellous idea that promotes environmentally friendly, user-serviceable travel at an affordable cost to the mass public.

This is exactly what product innovation should do – polarise initial opinion by ignoring convention. By challenging the norm, forward strides are made and genuine technology advancements are possible.

A prime example of successful innovation is the MacBook Air, a notebook not released to universal praise back in 2008 because Apple cut down the number of ports and omitted the optical drive. Three years later it accounted for almost one-third of Apple notebook sales. The market responded quickly, with online software distribution allowing users to continue installing programs while no longer having to rely on relatively slow physical media. This has allowed the size of notebooks across the industry to decrease dramatically since the original launch, with the ease of downloading software and the falling cost per GB of flash-based storage the driving factors.

A product that innovates is one that takes a risk by driving existing ideals in a new direction, allowing a change of outlook to deliver an approach that is future-proof while meeting the existing demands of users.

For business intelligence, the need to innovate has never been greater, as user interaction with consumable internet services continues to leave an unprecedented amount of data in its wake.

The figures are astonishing, with 40,000 Google search queries per second, 300 hours of video uploaded to YouTube every minute and 500,000,000 tweets sent every day.

By shifting away from the traditional tools that have slowly evolved over time, not only can this data be collected and stored, but it can also be leveraged in future data analysis.

Innovations in analytic solutions allow previously inaccessible unstructured data, such as tweets and videos, to be analysed not only individually but alongside structured data, gleaning greater insight from a much wider spectrum of records, regardless of origin. This breakthrough has far-reaching ramifications for industries that can innovate based on a better understanding and utilisation of customer sentiment analysis.

To combat the surge of data available, innovations in self-service analytics allow user-friendly access to the powerful reporting and dashboarding tools that were previously only available to those well-versed in the analytic field. Now, through the power of search, users are able to quickly navigate through a huge amount of data through natural language queries with the simplicity of an internet search engine.
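
A toy example of that search-engine style of navigation might look like the inverted index below, built over a handful of invented records – an illustration of the principle rather than any vendor’s implementation.

```python
from collections import defaultdict

# Invented records and a minimal inverted index, queried like a search engine.
records = {
    1: "outpatient appointment cancelled twice in March",
    2: "delayed discharge from ward seven awaiting social care package",
    3: "appointment rebooked and patient satisfied with outcome",
}

index = defaultdict(set)
for rec_id, text in records.items():
    for token in text.lower().split():
        index[token].add(rec_id)

def search(query):
    """Return the ids of records containing every term in the query."""
    hits = [index.get(term, set()) for term in query.lower().split()]
    return set.intersection(*hits) if hits else set()

print(search("appointment cancelled"))  # {1}
```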

For end users, this game-changing technology ensures that a new generation of analysts will not need vast statistical knowledge to perform complex analysis. Instead, it will be carried out by a wider range of personnel, empowering future workforces to pro-actively drive business change.

As data continues to be collected in both structured and unstructured formats, only with innovation can the big data challenges be addressed, utilised and learned from to better understand future outcomes. The end result will be an adaptive, educated and prepared workforce of data analysts who understand the bigger picture, utilising a range of innovative technologies to their advantage.


I’m sure many of you were as frustrated as I was when researching the pro-Remain and pro-Leave arguments for the EU referendum. If your experience was anything like mine, it proved a challenge to sift through arguments peddled to fit particular agendas.

Apart from the subject of immigration, one of the key campaign elements used to sway voters one way or another was the result’s impact on the NHS. In this post I aim to provide unbiased commentary on both sides of the argument, allowing you to make your own mind up about the future of the NHS following the UK public’s decision to leave the European Union.

£350 million per week funding

One of the more prominent claims made by the Leave campaign concerned the money we send to the EU each year, which is essentially paid to retain membership and which they put at £5.2 billion. Vote Leave claimed the UK sends £350 million per week to the EU – money that, the Leave campaign said, could be invested back into the NHS by the government upon Brexit, providing a boost to the already fragile financial position the NHS finds itself in.

Whilst the figure quoted by the Leave campaign is accurate, it doesn’t take into account the rebate the UK receives from the EU. Once that is subtracted from the original figure, the contribution works out at around £150 million per week – less than half the figure promoted by the Leave campaign.

Even then, there is no guarantee that all, or indeed any, of this money will be pumped back into the NHS, as Nigel Farage himself has stated. That decision is completely dependent on the government. NHS leaders have already called for quick clarity on this issue, as it will clearly go a long way towards securing the financial position of the NHS.

Reliance on EU staff

To help cope with the demand placed on the NHS over the last few decades, as people live longer and require more treatment for chronic conditions, the service has had to look to EU workers for support.

From the figures I have found, EU immigrants make up around 10% of the total NHS workforce in the UK and 5% in England. Whilst it is unlikely that Brexit will threaten the jobs of this segment of the NHS workforce, it risks casting uncertainty in their minds and shuts the door on the NHS bolstering its workforce with EU citizens in the future.

The Remain campaign argued that the transition from secondary care to social care is heavily reliant on workers from the EU, and that this puts elderly patients and patients with chronic health conditions at risk. Now that Britain plans to leave the EU, this transition is under threat at a time when social care is deemed vital to the future of the NHS.

The Issue of Migration

Migration, a main component of the campaign for the UK to leave the EU, also has an impact on the NHS. The argument championed by the Leave campaign predicts that a reduced influx of EU migrants will ease pressure on our critical NHS services upon leaving.

At a time when the NHS is struggling to meet its A&E waiting-time targets, amid reports of high numbers of staff absences through stress- and anxiety-related illnesses, this may well prove a positive for NHS staff.

It should be pointed out that leaving the EU and taking control of our own borders does not necessarily guarantee that net migration will fall; that is purely down to the government in power.

Conclusion

Essentially, nobody really knows what will happen next. The only thing we can really do is look at what we know today and try to guess what will happen tomorrow – which is what I have tried to do in this blog.

The impact of Brexit remains to be seen, not only for the NHS but across the UK and the world. What is certain is that the coming months and years will be filled with uncertainty as markets attempt to stabilise and the two-year process of leaving the EU begins.

During this period, it is critical that the NHS continues to commit to improving efficiency across the organisation. At a time of so much uncertainty, having the best information and analytical tools can certainly help to improve the quality of health services and to identify opportunities to increase efficiency.


What comes to mind when faced with the notion of ‘seeing’ into the future?

While fortune tellers and idiosyncratic FBI agents have provided the supernatural representations in popular culture, the reality has been far more mundane: data analysis.

Without knowing it, you may have already been exposed to the results of predictive analytics; online shopping recommendations based on previous purchases, suggested films to watch next and the adaptive language capabilities of a smartphone are all examples of real-life machine learning that has led to analytical outcomes. The more you interact with a service, the more it can tailor itself to a specific need. In short: the more data the better.
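
As a flavour of how that tailoring can work, here is a much-simplified, purely illustrative recommender that suggests items which most often co-occur with what a hypothetical user has already bought.

```python
from collections import Counter

# Hypothetical purchase histories; recommend the items that most often co-occur
# with what the user has already bought -- a much-simplified version of the idea.
histories = [
    {"kettle", "toaster", "mugs"},
    {"kettle", "mugs"},
    {"toaster", "bread bin"},
]

def recommend(basket, top_n=2):
    scores = Counter()
    for history in histories:
        if basket & history:               # this shopper overlaps with our user
            for item in history - basket:  # count the items our user lacks
                scores[item] += 1
    return [item for item, _ in scores.most_common(top_n)]

print(recommend({"kettle"}))  # ['mugs', 'toaster']
```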

Arguably, one of the most important aspects of data analysis hinges upon the discovery of trends to glean further insight. Nevertheless, there is traditionally a final step with a high chance of skewing or missing key results – the involvement of a fallible human being.

With machine learning, these intricate links can be extracted without intervention. This is especially important in the analysis of big data, where huge amounts of data can be systematically matched and sorted according to a set algorithm. While the algorithm itself is designed by an analyst, its application is the sole responsibility of the computer program poring over the data.

The importance of this operation is twofold: not only is the analysis much faster, but it is based on an iterative model – the machine actively learns from new data to adapt its analysis, maintaining the reliability of future results as new, fresh data becomes available. The aim is to produce innovative models and algorithms that can not only evolve with the data but predict future outcomes based on past findings.
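
A minimal sketch of that iterative idea, using scikit-learn’s incremental learning interface on synthetic data, might look like this – the data and the choice of model are purely illustrative.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

# The model is updated incrementally as each new batch of (synthetic) data
# arrives, rather than being retrained from scratch every time.
rng = np.random.default_rng(0)
model = SGDClassifier()

for batch in range(5):                       # five arriving batches of fresh data
    X = rng.normal(size=(100, 3))
    y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
    model.partial_fit(X, y, classes=[0, 1])  # learn from the new batch only

print(model.predict(np.array([[1.0, 0.5, 0.0]])))  # most likely [1]
```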

Take the new Google car, for example. Through extensive testing, it has been found that the self-driving program is too cautious, following the ‘perfect’ driving rules all of the time. The reaction has been to make the program behave more ‘aggressively’, with the aim of better integrating the computer-controlled cars with those driven by people. Edging forward at a stop sign and attempting to merge with moving traffic are relatively minor traffic violations committed by many drivers that, through machine learning, can be ‘taught’ to the program as it drives. Until every car on the road is controlled by a computer, these compromises must be made so that integration is as seamless as possible.

Another scenario is safeguarding, with specific implementations that can reap huge benefits in real-life situations. One example is the data used in the identification of potential child abuse victims. Through machine learning, the data on previous cases can be analysed to make predictions using key identifiers: limitless amounts of siloed data such as hospital admissions, locality, age of parents and school attendance can be correlated to distinguish key variables that may lead to the identification of future cases. The use of Bayesian Networks (BNs) to maintain the validity of the data is vital to the contributory nature of the analytical output. Simply put, the right analysis would help distinguish, prevent and safeguard those who may have gone unnoticed without statistical intervention.
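
For illustration only, the snippet below trains a naive Bayes classifier – a very simple Bayesian model – on entirely synthetic indicators to produce a probability that a new case warrants review. A real safeguarding model would be built with domain experts, far richer data and proper governance.

```python
from sklearn.naive_bayes import BernoulliNB

# Entirely synthetic, made-up binary indicators:
# [repeat hospital admissions, persistent school absence, prior referral]
X = [
    [1, 1, 1],
    [1, 0, 1],
    [0, 0, 0],
    [0, 1, 0],
    [1, 1, 0],
    [0, 0, 1],
]
y = [1, 1, 0, 0, 1, 0]  # 1 = case was escalated in the synthetic history

model = BernoulliNB().fit(X, y)
new_case = [[1, 0, 0]]
print(model.predict_proba(new_case))  # probability the new case warrants review
```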

The use of data analysis in business has been synonymous with the need not only to better understand the data available, but to increase the amount of data that can be analysed. As computing power increases while the price of hardware continues to fall, more powerful analysis can be achieved without the traditional analyst-based prerequisite. Data analysis retains its functionality on a much larger scale, and the big data challenges facing organisations can be handled by computers that adapt to new data through machine learning. Predictive analytics has already taken care of our shopping and streaming habits; now it will move on to the bigger challenges that are the cornerstones of big data.
