Our Case Studies.
Real examples, real results.
We deliver value
in many industries.
The Data Analysis Bureau’s data scientists can use data to create value for your company in any number of ways. The possibilities are almost endless, and our data scientists can help you explore them. Some of the data you may already have; some you may need to collect.
Embrace Industry 4.0 and drive intelligent solutions through the adoption of machine learning and predictive analytics to increase production and quality, reduce costs and waste, and manage production remotely.
Increase customer engagement through personalised experiences and recommendation engines and maximise revenues by improving demand forecasts and optimising supply chains.
Improve operations by automating repetitive tasks and deploying chatbots as virtual assistants for consumer interaction and sales, and predict market trends to secure a competitive advantage.
Enhance patient care and diagnosis, whilst reducing costs and improving response times through automation and improved medical imaging. Increase insight by collecting IoT device data and rapidly summarising research for discovery.
Improve decision making and increase profitability by better understanding your customers and market trends to recommend services and solutions, and by reducing risk and detecting anomalous behaviour.
Optimise asset performance through improved outage management and energy distribution, predicting asset anomalies to reduce resolution times and better understand customer needs and pain points.
The challenges presented by our clients range from simply increasing supply chain visibility with interactive dashboards to optimising machine performance through machine-learning-driven AI. We’ve delivered a range of exciting projects for our customers using our data accelerator framework.
The framework provides support in identifying a unique data roadmap, clearly communicates the available services for selecting and building solutions, and helps manage ongoing operations.
Our Data in Action.
Prediction of spoilage and failure events in the manufacturing chain for a leading packaging manufacturer.
A global manufacturing company was looking to bring predictive analytics to its packaging production line.
In particular, they were keen to understand how machine learning could be applied to reduce machine downtime and spoilage from production errors.
T-DAB initially used one year’s worth of data, including machine state, output quality, tool life, and operational data. Machine learning was first applied to mine the dataset for the key influential features from an initial list of 64, and then to predict spoilage and tool failure events within future time periods.
T-DAB first carried out a data audit, cleaning, and wrangling exercise, followed by feature engineering. Machine learning experimentation was carried out in R.
The end result was a set of ML models able to predict spoilage and tool failure events with a degree of accuracy (>80%) significant enough to have real-world impact on operational processes, reducing spoilage and downtime.
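The two-stage approach described above, mining a wide candidate feature list for the influential variables and then training a predictor on the reduced set, can be sketched as follows. This is an illustrative Python/scikit-learn sketch only (the original work was carried out in R), and all data, feature indices, and thresholds here are synthetic placeholders, not the client's dataset.

```python
# Sketch: feature mining followed by spoilage-event prediction.
# Synthetic data stands in for the real machine/tool/quality dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_samples, n_features = 2000, 64          # 64 candidate features, as in the project
X = rng.normal(size=(n_samples, n_features))
# Synthetic target: spoilage driven by a handful of "influential" features
y = (X[:, 0] + 0.8 * X[:, 3] - 1.2 * X[:, 7] > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stage 1: fit a forest and rank candidate features by importance (the "mining" step)
miner = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
top = np.argsort(miner.feature_importances_)[::-1][:10]

# Stage 2: retrain on the reduced feature set and evaluate predictive accuracy
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train[:, top], y_train)
acc = accuracy_score(y_test, model.predict(X_test[:, top]))
print(f"selected features: {sorted(top.tolist())}, accuracy: {acc:.2f}")
```

Ranking features before retraining keeps the final model focused on the variables that matter operationally, which is what makes the resulting predictions actionable on the production line.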
Statistical Modelling and ML Driven Data Mining of Variables Predicting Consumer Behaviour.
An FMCG company needed to improve its consumer modelling and analytics to drive its retail and marketing strategy.
The marketing teams needed to run multiple scenarios to understand how changing consumer perceptions and targeting certain demographic groups may allow them to alter the market share of different products.
T-DAB developed an automated data mining process and leveraged machine learning algorithms in open-source R to help the client better understand where to focus their resources and develop a strategy to target specific consumer groups and market sectors. This involved both automated machine learning processes and inferential statistical modelling methods.
The team then built on this and used R-Studio to build an analytical tool, driven by machine learning and statistical models, to allow users to interactively explore consumer relationships and test market scenarios.
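The scenario-testing idea behind that tool can be sketched simply: fit a model of purchase propensity, then re-score the population under a shifted input (for example, a lift in a consumer-perception score) and compare the predicted shares. The client's tool was built in R/RStudio; the sketch below is a hypothetical Python stand-in on synthetic data, with all variable names and effect sizes invented.

```python
# Sketch: model consumer purchase propensity, then run a "what if" scenario.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 5000
perception = rng.normal(size=n)           # e.g. a brand-perception score
age_group = rng.integers(0, 3, size=n)    # coded demographic segment
X = np.column_stack([perception, age_group])
# Synthetic ground truth: perception drives purchase; younger groups buy more
y = (rng.random(n) < 1 / (1 + np.exp(-(0.9 * perception - 0.4 * age_group)))).astype(int)

model = LogisticRegression().fit(X, y)

baseline_share = model.predict_proba(X)[:, 1].mean()
# Scenario: a campaign lifts perception by 0.5 across the population
X_scenario = X.copy()
X_scenario[:, 0] += 0.5
scenario_share = model.predict_proba(X_scenario)[:, 1].mean()
print(f"baseline share {baseline_share:.3f} -> scenario share {scenario_share:.3f}")
```

Wrapping this re-scoring step behind interactive controls is what lets marketing teams explore many scenarios quickly without re-running the modelling pipeline by hand.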
Application of Machine Learning Driven AI to a Cutting Edge Manufacturing Company.
A UK manufacturer needed to optimise their composite material production to reduce operational downtime and development cost.
In addition, the client works with experimental masses of material, often operating beyond the current understanding of how these composite materials behave.
This meant the client needed to regularly change, test and review the production setup, often slowing production, increasing costs and risking delivery.
T-DAB designed a solution incorporating machine learning driven AI into the machine calibration. This would enable the manufacturing machines to dynamically adjust and optimise their function while still in operation, saving development time and cost.
A holistic approach was proposed, incorporating Azure cloud distributed and localised computing services, edge deployment, IoT, and cutting-edge deep learning frameworks (Deeplearning4j). This allows the machines to be calibrated as part of the learning process (i.e. training algorithms on input and output data gathered during operation), so that calibration improves continuously rather than requiring production to stop.
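The core idea of dynamic in-operation calibration can be illustrated with a toy online-learning loop: the controller nudges a machine setting towards a drifting optimum using feedback measured during operation. This is a deliberately simplified Python sketch, not the Deeplearning4j solution described above; the process model, drift rate, and learning rate are all invented for illustration.

```python
# Toy sketch of dynamic calibration: track a drifting optimal setting
# using gradient steps on measured feedback, without stopping the machine.
def run_calibration(steps=200, lr=0.2, drift=0.01):
    setting = 0.0       # current machine setting (starts uncalibrated)
    optimum = 2.0       # unknown optimal setting, drifts during operation
    errors = []
    for _ in range(steps):
        optimum += drift                    # process drift during operation
        # Measured spoilage/error grows with distance from the optimum
        error = (setting - optimum) ** 2
        errors.append(error)
        # Gradient step on the measured feedback: d(error)/d(setting)
        setting -= lr * 2 * (setting - optimum)
    return errors

errors = run_calibration()
print(f"initial error {errors[0]:.2f}, final error {errors[-1]:.4f}")
```

The point of the sketch is the shape of the loop: because each adjustment uses live feedback, the setting keeps tracking the optimum as conditions drift, which is what removes the need for stop-and-recalibrate cycles.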
Productisation of Publicly Available Insurance Solvency II Data for B2B Marketing.
T-DAB scoped, trialled, and built a new visualisation product, helping the client quickly assess their data product requirements and build a clear development roadmap.
The team provided an end-to-end service, guiding the client through the selection of the appropriate technology for delivery of the product: first a proof of concept, then a minimum viable product, and finally a first go-to-market release. The solution was delivered using a combination of an on-premise MySQL database and Tableau visualisation software.
Cloud architecture and machine learning to enable predictive analytics for a super-material manufacturer.
T-DAB designed and built a suitable AWS architecture to batch-ingest test data from individual .csv files into a database.
This consisted of an automated process for uploading files to Amazon S3. A scheduled Amazon ECS C# process pulled data from the bucket and inserted it into an MS SQL database which, for security, was contained in a private subnet. An Amazon EC2 R Server instance in a public subnet was connected to the MS SQL database, with an IAM role and security group restricting database access to the ECS and EC2 resources only. An elastic load balancer sat in a public subnet above the EC2 R Server instance’s subnet, with IP access restricted using security groups. The R Server instance provided the analytics layer, used for training ML regression-style algorithms to predict super-material performance.
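The central ingest step, parsing each uploaded CSV of test data and bulk-inserting its rows into the database, can be sketched as below. In the real architecture files landed in S3 and a scheduled ECS C# process loaded them into MS SQL; this runnable Python stand-in uses an in-memory SQLite database and invented column names purely to illustrate the shape of the pipeline.

```python
# Simplified stand-in for the batch-ingest step of the pipeline:
# parse one test-data CSV and bulk-insert its rows into a database.
import csv
import io
import sqlite3

def ingest_csv(conn, csv_text):
    """Parse a test-data CSV and insert all rows; returns the row count."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    conn.execute(
        "CREATE TABLE IF NOT EXISTS test_data (batch TEXT, strength REAL)"
    )
    conn.executemany(
        "INSERT INTO test_data (batch, strength) VALUES (:batch, :strength)",
        rows,
    )
    conn.commit()
    return len(rows)

conn = sqlite3.connect(":memory:")
sample = "batch,strength\nA1,12.5\nA2,13.1\n"
count = ingest_csv(conn, sample)
print(f"ingested {count} rows")
```

Keeping the ingest step idempotent and scheduled (rather than manual) is what made the downstream analytics layer reliable enough to train models against.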
Project mobilisation and advisory for super materials manufacturer.
T-DAB have been working alongside the strategic projects directorate and technical team leads for the last 6 months.
T-DAB have delivered a range of services, including strategic consultancy on the value and feasibility of data projects, scoping and planning of data projects, liaison with key technology providers, solution architecture design, advisory on technology choices, and recruitment advisory.
T-DAB have also provided data auditing, data exploration, and proof-of-concept (PoC) work to help inform project planning and long-term decision making.
Whatever the case, The Data Analysis Bureau works closely with you to define your challenge, even before you begin collecting data. We give clients the ability to identify the value of their data, obtain the best return and mitigate risk, and deliver insights that drive better decisions.
Although autopilots are at the helm of sailing boats during large parts of the most prestigious sailing races, their inner workings remain largely untouched by machine learning and its opportunities. Explore how we developed a reliable digital twin of a sailing yacht that allows a deep reinforcement learning algorithm to learn intelligent steering behaviour.
Over the last few years, considerable progress has been made in the field of Automatic Text Summarization, the branch of Natural Language Processing concerned with building programs that can automatically summarise written content. This article, the first of a mini-series on the area, gives a short outline of the field and an overview of some of the leading approaches.