Algospark Core Insights is a service that designs and delivers dashboards, so organisations receive key insights more quickly and easily. A Core Insights consultancy is typically a four-week engagement that brings together the best of business analytics and visualisation. We define a reporting roadmap and deliver the highest priority dashboards.
Core Insights is an excellent launch pad for enhancing the value of your data and the skills in your analytics team. Many clients also use Core Insights to identify and validate key opportunities for applied AI.
Algospark Azure Core Insights brings together the optimal mix of PowerBI, the Power Platform and Azure Synapse (Azure Data Warehousing).
We are a Microsoft AI Global Partner, PowerBI Partner, Gold Data Analytics and Gold Data Platform partner. Microsoft continually develops its data services and applications, enabling us to deliver ever better services too.
Get in touch so that we can help you with Core Insights and your applied AI journey.
We’re back in the office and have our name on the board at the Impact Hub in Kings Cross! Come and meet us at our new working space in Central London. We would love to discuss how we can help you drive more value from your data, and how we can empower your organisation with predictive analytics and applied artificial intelligence solutions.
We are at the Impact Hub King’s Cross, 34b York Way, King’s Cross, London, N1 9AB, United Kingdom.
Algospark has been awarded another gold level Microsoft competency! We are proud to be Microsoft Gold Partner for Data Analytics and Data Platform.
This gold level competency demonstrates our data skills throughout the applied AI development cycle. Our approach spans analytics, rapid prototyping, delivery and management of production systems. A big congratulations goes to our team of applied AI specialists. https://algospark.com/#who
Get in touch to discuss how we can help you drive value with applied analytics and AI solutions.
Ever thought you could get more from your analytics team? Most businesses rely on the analytics team having a great understanding of how the business generates value. They also assume the team generates the best insights from the best data available. But are you enabling them with the best tools and processes?
Many organisations are still taking Excel file dumps from SQL databases and then painfully pulling together periodic management reports. Management do not like the time required to prepare reports (reporting lag), and analysts are frustrated by the process of pulling together reports. This is before we even consider the potential for data errors and data inconsistencies to creep in.
At Algospark, we use a fast track framework to help super power analysts and reporting systems. Our Core Insights framework helps analysts move from process enablers to insights specialists.
Core Insights is designed to move analytics from reporting to insights. We do this through automated procedures and prioritisation of key impact drivers. Once this framework is in place, analytics teams can move beyond Excel. Your analysts will spend more time on insights. It will also enable them to quickly move into the realm of predictive analytics (what will happen?) and prescriptive analytics (what should I do?).
Our Core Insights service not only focuses on enabling processes and systems, but also the skills and capability of your analysts. We help analysts move beyond Excel to deepen their data skills in SQL, R and Python.
Get in touch! Let us help you super power your analytics team. Contact us at firstname.lastname@example.org
Microsoft offers a wide range of certifications for many types of technology domains and roles. Earning a Microsoft certification proves your knowledge and proficiency. Microsoft offers three different certification categories: fundamentals (knowledge of a particular domain), role-based and speciality (advanced).
Preparation for exams is through hands-on experience and study with either free Microsoft learning paths or paid instructor-led training. Certification requires passing an exam.
We believe the most useful Microsoft certifications for data and AI professionals are:
Azure Data Scientist Associate
Azure Data Engineer Associate
Data Analyst Associate
We also really like the following for a wider understanding of applied solutions in Azure and the Power Platform:
When you spend time and energy learning or using a technology, a certification is an excellent way to prove your understanding.
Algospark Microsoft Certifications
Our staff hold numerous Microsoft certifications. This contributes to our status as a Global AI Inner Circle Partner, Power BI Partner and Gold Microsoft Partner.
At Algospark, we apply our data knowledge and expertise to deliver value adding applied analytics and artificial intelligence solutions. Our fast-track implementation means we launch and iterate quickly, saving you time and money. Get in touch for more information! https://algospark.com
Cyber Essentials is a UK certification from the National Cyber Security Centre. It defines minimum IT security policies and helps guard organisations against cyber attack. It also demonstrates ongoing commitment to cyber security.
Cyber attacks come in many shapes and sizes, but the vast majority are very basic in nature, carried out by relatively unskilled individuals. They’re the digital equivalent of a thief trying your front door to see if it’s unlocked. Cyber Essentials is designed to prevent these attacks.
Benefits of certification:
Attract new business
Define parameters of organisation level cyber security
Some Government contracts require Cyber Essentials certification
Customer management can be a difficult process if we do not understand why our customers behave the way they do. Why are they leaving us? Is it because they are not satisfied? Could we have anticipated it and taken action before it was too late? Moving from a reactive to a proactive customer management strategy can help you identify customers that are likely to stop buying, or that are displaying unusual trends.
By making the most of your data with applied AI, we can help you identify key customers and prioritise those your team should focus on, minimising revenue at risk. At the same time, our Azure Proactive Customer Management tool integrates a recommender engine that suggests which products or services each customer is most likely to buy, increasing revenue opportunities.
For each customer, we use several AI models to calculate risk probabilities and the potential value at risk. Using data you already have, such as historical transactions, customer and product details and customer support history, you can take a data-driven, proactive customer management approach and act now, not when it is too late.
After the AI models have finished computing the predictions, the results are delivered in an easy-to-use dashboard where you will find all the information you need to start reaching out to your key customers, focusing on fixing problems and driving revenue growth.
Is it for me?
If you have a large number of products and / or customers, Azure Proactive Customer Management is for you. Examples include:
Manufacturers with hundreds of products and customers.
Retailers with in-store and online customers.
Hospitality, travel and leisure businesses.
Service providers such as telecoms, utilities, insurance and financial services.
At Algospark we deliver applied analytics and artificial intelligence solutions focusing on business value. Our fast track approach means we can launch and iterate quickly, saving you money and time. Get in touch for more information!
Algospark is an applied analytics and artificial intelligence solutions consultancy. Back in 2019, we were not predicting a pandemic, and neither were many others. But this article is not about pandemic predictions, rather how to plan and predict in times of high uncertainty and fundamental change.
The reaction to, and impact of, the coronavirus pandemic has presented huge challenges to society and the economy. During times of fundamental change, businesses still need to plan, forecast and deliver. Fast changing rules with huge economic impact have made for highly uncertain times. We have seen high volatility in economic activity and broken forecasting models across many industries.
Successful planning and delivery in highly volatile times requires reviewing data preparation and changing modelling approaches. We follow these steps:
Understand key impact areas
Use distinct time periods mapped to specific phases
Prepare data with new labels and context
Explore new model approaches and review look back periods
Review, revise and update models
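As an illustrative sketch of the middle steps above (distinct time periods mapped to phases, data prepared with new labels), the snippet below tags each observation with a hypothetical regime label. The dates and phase names are invented for the example:

```python
from datetime import date

# Hypothetical regime boundaries: label each observation with the
# "data regime" it falls in, so models can treat phases separately.
REGIMES = [
    (date(2020, 3, 23), "pre_pandemic"),  # before the first lockdown
    (date(2020, 7, 4), "lockdown_1"),     # strict restrictions
    (date(2021, 1, 1), "reopening"),      # phased easing
]

def label_regime(d: date) -> str:
    """Return the regime label for a given observation date."""
    for boundary, label in REGIMES:
        if d < boundary:
            return label
    return "recovery"

observations = [date(2020, 1, 15), date(2020, 4, 10), date(2021, 6, 1)]
labels = [label_regime(d) for d in observations]
print(labels)  # ['pre_pandemic', 'lockdown_1', 'recovery']
```

With labels like these attached, models can be trained per regime, or the regime can be fed in as a feature, rather than treating the whole history as one homogeneous data set.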
Organisations that have successfully adapted are outperforming during the pandemic. We have worked with several clients during this period to review and update their applied analytics.
The volatility over the last few months has led to several “data regime” changes, defined by changing rules and economic behaviour. A common mistake is to treat data from different regimes as useless. There is significant value in “regime modelling” and the insights it can generate.
Another mistake is waiting until there is a return to pre-pandemic conditions. We believe there will be a gradual evolution, and that adaptive forecasting and proactive customer management are the best ways to respond to evolving conditions.
In addition to making business planning much more difficult, the pandemic has also increased the focus and pressure on customer service. For many individuals, the pandemic has meant more free time. Unfortunately this has come at the same time as many customer support teams have moved to new ways of working (remotely), with lower head count and higher volumes of customer calls. Succeeding in this environment underlines the importance of good planning, predicting and prioritizing. After all, good customer service is rooted in anticipating needs.
We have a portfolio of data science led frameworks to solve the challenges of the pandemic. If you are looking to review a forecasting process, refresh your applied analytics approach or proactively re-engage with your customers, then get in touch!
We are building for pandemic recovery and would love to help you.
We are proud to launch Algospark Azure Proactive Customer Management. We enable you to move from reactive customer management to proactive customer management. Put more simply, get ahead of customer issues before you are called about them. Get in touch to discuss how we can help you!
Value of proactive customer management:
Increased revenue per customer, increased customer satisfaction and reduction in churn.
Anticipation – move away from reactively responding to calls to proactively managing calls.
Prioritise the order of calls based on value. Identify and minimise customer value at risk.
Map insights to action, and align to promotions and pricing policies.
Create a virtuous learning loop for insights and successful actions.
Azure Proactive Customer Management builds on: Azure Cloud, Azure Machine Learning, Synapse, Power BI, Power Apps and Power Automate. We also have easy connectors to align with existing activities and processes in Customer Relationship Management (CRM) systems and other customer support applications.
Please get in touch for a full demonstration. We would love to discuss how we can help tailor Azure Proactive Customer Management to meet your specific needs. Contact us at email@example.com
Well done to the team! We are proud to demonstrate our competence across Azure solutions to become a Microsoft Gold Partner for Data Analytics. This is in addition to our status as Global AI Inner Circle Partner, Power BI Partner and Microsoft Silver Partner for Data Platforms.
Algospark helps clients identify opportunities and fast track value using predictive analytics and applied AI. Get in touch!
The UK Government Dynamic Purchasing System (DPS) Marketplace provides access to all procurement run by Crown Commercial Service. Buyers access framework agreements that meet common purchasing requirements across government.
Benefits for buyers:
Aligns to government standards and guidelines, including the Data Ethics Framework and the Office for AI’s Guidelines for AI Procurement.
Promotes standards and criteria for artificial intelligence and data.
Embeds ethical considerations when innovating with and buying artificial intelligence.
Addresses Intellectual Property Rights in the AI market.
Ensures the appropriate suppliers are accessible to provide the right service offerings, reducing procurement timescales and providing an easier route to market for AI.
Get in touch with us to discuss how we can help deliver applied analytics and artificial intelligence solutions for government.
Transfer learning is a way for data science experiments to build on existing successful models. It has been popular in Computer Vision where open source deep learning models trained on ImageNet are commonly used as a base for training more specialised models. Using the same methodology in NLP (Natural Language Processing) means next generation machine comprehension models do not have to start from a blank canvas.
Open source machine comprehension models are now widely used, offering a viable alternative to building new projects from scratch without a base model.
Transfer learning usually follows one of three paths: 1) re-train all weights of an existing model architecture, 2) freeze some layers and train the others, or 3) freeze the entire base model and train only a new task-specific output layer. The easiest starting point is usually the last, particularly when your training data set is small. The other two approaches are typically used in later-stage model testing and tuning.
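The third path (freeze everything, train only a new head) can be illustrated with a toy sketch. Here a fixed random projection stands in for a pretrained base model, purely as an assumption for the example; in practice the base would be a real pretrained network such as a BERT encoder:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Base model": a frozen feature extractor. In real transfer learning this
# would be a pretrained network; here a fixed random projection stands in
# for it, and its weights are never updated.
W_frozen = 0.3 * rng.normal(size=(10, 6))

def extract_features(x):
    return np.tanh(x @ W_frozen)  # frozen layers: forward pass only

# Small labelled data set for the downstream task (toy labels).
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(float)

# Trainable "head": a single logistic layer trained on top of the features.
F = extract_features(X)
w, b = np.zeros(6), 0.0
for _ in range(500):  # plain gradient descent, head weights only
    p = 1 / (1 + np.exp(-(F @ w + b)))
    w -= 0.5 * F.T @ (p - y) / len(y)
    b -= 0.5 * (p - y).mean()

accuracy = ((F @ w + b > 0) == (y == 1)).mean()
print(f"head-only training accuracy: {accuracy:.2f}")
```

Because only the small head is trained, this approach works with modest labelled data sets and cheap hardware, which is exactly why it is the usual first step.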
BERT (Bidirectional Encoder Representations from Transformers) is a popular model for machine comprehension. Developed by Google AI, it also has several specialised flavours (eg RoBERTa, a robustly optimised variant). BERT has been trained on a large amount of unlabelled text, including Wikipedia and Book Corpus (over 3 billion words). A quick web search will explain how to implement BERT for numerous NLP (Natural Language Processing) tasks such as spam detection or chatbots.
Beyond BERT there are many more open source NLP base models. HuggingFace is particularly active in providing access to libraries and APIs. As they state on their site, “solving NLP, one commit at a time”.
OpenAI’s GPT-3 (Generative Pre-trained Transformer) is also a popular starting point. It has 175 billion parameters. The size of such models naturally limits the ability of NLP practitioners to import and build on them locally, so APIs are the natural interface.
There are numerous approaches to solving NLP challenges, and the right one depends on many factors, including the language used, time periods, technical considerations, the size of data sets and the specifics of the use case. However, building on the shoulders of giant NLP models is certainly worth considering for many use cases.
PowerBI is at the heart of Algospark Core Insights Consultancy. We map, prioritise, design and deliver dashboards so that organisations receive key insights more quickly and easily. Core Insights saves time on key report preparation and reduces time from insight to action. It can also identify opportunities for applied AI projects. A simple PowerBI example of model outputs can be found here. Get in touch to discuss how Core Insights can help you.
Here is our AI Use Case Grid. We use this to shape early discussions about where AI can drive the most value for an organisation. It allows us to explore opportunities, quantify value and develop project prioritisation and roadmaps.
The grid maps to high level organisation processes and then links to core outputs of AI models to help understand existing organisational capabilities. This allows us to frame potential gaps and determine the value add from applied analytics and AI.
Get in touch to learn how we can help you fast track value from analytics and artificial intelligence.
Algospark recently gave a presentation to CEOs and industry pioneers at the YPO organization about how to fast track benefits from applied artificial intelligence in the asset management sector. We kept use cases broad so that they would be relevant across the asset management industry, i.e. from large retail focused wealth managers through to boutique fund managers.
Here is a summary of the use cases that we covered.
Use Case 1 : Customer and idea clustering
Focus area: New sales opportunities
Why: Increase sales funnel size and quality
Benefits: Sales increase by x%
Use Case 2: AI-driven insight – news analytics
Focus area: Improved trading performance
Why: Generate insights, focus on the right things at the right time
Benefits: Consistent analytics framework and analyst time saving of x%
Use Case 3: Outlier detection / compliance monitoring
Why: Spot bad behaviour early
Benefits: Regulatory compliance, reduce existential threats and compliance review time saving of x%
When evaluating the most effective way to implement AI solutions we nearly always advocate a business transformation approach using micro-services architecture. This ensures good business adoption and reduces time to solution. It also makes sure that the solutions are scalable and flexible.
High value add AI is rarely “off-the-shelf”
Needs to be defined by business priorities and impact assessed across: process, roles, data and technology
Define a future state
Build a change map, prioritised list of solution requirements, business case and project plan
Define and build a rapid working prototype (R, Python, Shiny, Django)
Develop the prototype in a sand-box for low early IT dependency
Use agile delivery to iteratively deliver every 2 weeks
Each service is self-contained and implements a single capability (avoid creating IT “Gordian knots”)
Interface across stand-alone micro-services using APIs (Application Programming Interfaces)
Use a “pick and mix” approach of micro-services to deliver overall service functionality
Use Case 1 Overview: Customer and Idea Clustering
What: clustering customers and ideas to increase sales success and productivity
Why: improving sales idea success by x% and reducing overall ideas sales time by y%
Clustering customers and ideas to prioritise target lists for investment ideas
Run through an appropriateness filter for idea / customer suitability (classifier)
Use recommender systems to determine probabilities of sales
Quality of CRM data (customer type, portfolio, objectives, recent activity)
CRM data architecture and API
Personal data protection using pseudonymisation
Data hosting and computation (legislation)
In-house data and analytics capabilities vs third party provider
Customer and ideas heatmap / dendrograms:
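As a hedged sketch of the clustering step described above, here is a minimal k-means run on invented two-feature customer data. A real engagement would use richer CRM features and a library implementation; everything here is simplified for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

# Invented customer features (e.g. standardised portfolio size and trading
# frequency) with two obvious groups simulated for illustration.
customers = np.vstack([
    rng.normal([0, 0], 0.3, size=(20, 2)),   # low-activity group
    rng.normal([3, 3], 0.3, size=(20, 2)),   # high-activity group
])

def kmeans(X, init, iters=20):
    """Minimal k-means: assign each point to its nearest centroid, recompute."""
    centroids = init.copy()
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        centroids = np.array([X[labels == j].mean(axis=0)
                              for j in range(len(centroids))])
    return labels, centroids

# Deterministic initialisation (one seed point from each end) keeps the
# sketch reproducible; real use would initialise more carefully (k-means++).
labels, centroids = kmeans(customers, customers[[0, -1]])
print(np.bincount(labels))  # two clusters of 20 customers each
```

The resulting cluster labels are what would feed the heatmaps and dendrograms, and downstream steps such as the appropriateness filter and recommender.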
Key considerations for project delivery success:
Realistic assessment of in-house capabilities and support
Speed to insight considerations, not everything needs to be real-time
Process and focus of roles will need to change to realise benefits
Good data means good solutions: the impact of data governance, data architecture and maintenance
Flexible technology (transition to API driven micro-services)
If you are interested in learning more about this use case, the other use cases, or successful approaches to AI implementation, please do not hesitate to get in touch.
Algospark: applied analytics and artificial intelligence solutions.
Federated learning is a form of distributed model training where data remains on client devices. This means data is not passed directly to the coordinating server. By implication, models learn (ie train) using personal data sets without actually seeing the underlying data. Published results from federated learning models in production are currently limited. However, in a world of GDPR and increased data protection legislation overall, this model training methodology is likely to receive much more attention in the world of AI and machine learning.
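A minimal sketch of the federated averaging idea, assuming a toy one-parameter linear model and synthetic client data: each client trains locally on its private data, and the server only ever sees model weights, never the data itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: each client holds private data drawn from the same
# underlying relationship y = 2x. Raw data never leaves the client; only
# model weights are shared with the coordinating server.
clients = [rng.normal(size=50) for _ in range(3)]

def local_update(w, x, lr=0.1, steps=20):
    """Client-side training: gradient descent on local data only."""
    y = 2.0 * x  # each client's private labels
    for _ in range(steps):
        grad = ((w * x - y) * x).mean()
        w -= lr * grad
    return w

w_global = 0.0
for _ in range(5):  # federated averaging rounds
    local_weights = [local_update(w_global, x) for x in clients]
    w_global = np.mean(local_weights)  # server averages; it never sees data

print(f"learned weight: {w_global:.2f}")  # converges towards 2.0
```

Production systems add secure aggregation, client sampling and differential privacy on top of this basic loop, but the core "train locally, average centrally" pattern is the same.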
See an example of federated learning approach with R and Tensorflow at the link below.
We are proud to demonstrate our competence across Azure machine learning solutions and become a Microsoft Silver Partner for Data Analytics. Algospark is a predictive analytics and applied Artificial Intelligence (AI) specialist. We help clients identify opportunities and fast track value from AI. Get in touch!
We are proud to announce that Algospark has joined Microsoft’s AI Inner Circle Partner program. The program includes specialists that are able to provide custom services or enhanced AI product solutions using Microsoft AI technologies.
Algospark works with numerous delivery partners to deliver the best solutions for our clients. We are excited to be working with, and to be recognised as an AI specialist by one of the industry leading companies in AI technology development.
Congratulations to the visionary Legal Team at Pernod Ricard Global Travel Retail for their “highly commended” award at the British Legal Awards 2019! Algospark are proud to have designed and delivered the enabling AI technology and service.
The “AI Approve Tool” reviews advertising image compliance with laws and corporate standards. It delivers multiple rule checks across multiple countries to generate an approve, refer or reject decision. This ensures decision consistency and significantly faster process time.
The web based solution brings together a suite of translation, natural language processing, computer vision and classification algorithms. It has been built to scale using a micro-services design with containers to facilitate a fast production ready AI service.
As part of the launch of “Future Decoded 2019”, Microsoft UK have released this report, Accelerating Competitive Advantage with AI. It contains tips on successful approaches to applied AI, survey results from UK enterprises and use cases. It also reiterates key messages that we use at Algospark:
AI goes beyond data: it is part of a business transformation process.
AI is key to competitive advantage.
AI needs change management to ensure success.
Happy reading! Get in touch with us to discuss how we can help you accelerate innovation using applied AI!
There has never been a better time to explore new innovation opportunities using analytics and applied AI solutions! How do you get started? At Algospark we use a portfolio of existing solutions and fast prototyping to minimise time to solution discovery. It is OK to aim for 80% of the solution. It can be iterated later. If you’ve launched at 99% accuracy, you’ve launched too late.
Another method to accelerate the time to solution is to develop in a sand pit. These are also called innovation spaces, test beds or labs. It means using an environment that does not need to fully integrate with existing IT systems from the beginning. Using prototypes in these environments helps quickly migrate to full solutions further down the road.
We also use a modular approach, also called a “micro-services perspective”, when we build solutions. This means that each component in the solution is added with minimal dependency on other components. It makes the solution much more flexible and easier to align with other systems and processes later in the development cycle.
The development methodology should be agile and iterative. It should also have a DevOps perspective, ie quick to update and deploy, and easy to manage. It should also have a DataOps perspective. This is a set of processes used to improve quality and reduce the cycle time of data analytics.
Throughout the solution building process it is critical to ensure end user adoption. This is where applied data science meets business transformation. New processes and services always need to be in the context of user adoption. They also need to be developed within a strong business transformation and change management framework.
Contact Algospark! We have a portfolio of solutions, frameworks and existing products that minimise development risk. We use tried and tested methods of quickly realising value. Speak with us about how we can help you.
“A significant constraint facing firms looking to grow their analytics capabilities is their ability to acquire the right talent. After identifying what talent and skills are needed, businesses are then finding that these skills are in short supply.”
“With demand outstripping supply, individual businesses are faced with the challenge of how to attract highly sought-after talent. In many cases, although the skills being sought after are highly specialised, often they are sector agnostic. As a result, firms across sectors are fishing in the same pool for talent.”
“In a competitive market where analytics skills can command high salaries, companies struggle to compete with the world’s leading digital companies. Companies whose brands aren’t known for providing exciting opportunities for employees to use their data and analytical skills find it hard to reach their target audience.”
Speak with us at Algospark to discuss how we can help fast track your innovation initiatives using applied analytics and artificial intelligence solutions.
Want to reduce complaints, focus on customer priorities and reduce churn? You can do this by using Natural Language Processing (NLP) and customer profiling. We convert text to metrics, and then to emotions. This helps quantify what is important, and then allows tracking of impact to ensure the best reactions to customer feedback.
There is no point listening if there is no action! Algospark build tailored frameworks for feedback that track, aggregate and map insights. Verbatim feedback quantification is a key foundation to building strong customer relations. See a basic example here
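As a deliberately simplified sketch of the "text to metrics" step, the snippet below scores verbatim feedback against small invented emotion word lists. Production systems would use trained NLP models rather than word lists; this only illustrates the shape of the output:

```python
# Tiny invented emotion vocabularies, purely for illustration.
EMOTION_WORDS = {
    "frustration": {"slow", "waiting", "annoying", "useless"},
    "satisfaction": {"great", "helpful", "quick", "thanks"},
}

def score_feedback(text: str) -> dict:
    """Count emotion-word hits in one piece of verbatim feedback."""
    words = set(text.lower().split())
    return {emotion: len(words & vocab)
            for emotion, vocab in EMOTION_WORDS.items()}

feedback = [
    "great service and a quick response thanks",
    "the app is slow and the waiting times are annoying",
]
for text in feedback:
    print(score_feedback(text))
```

Once feedback is quantified like this, the scores can be aggregated per product, region or time period, which is what enables tracking of impact after action is taken.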
We are predicting that Arsenal will win the Arsenal v Chelsea fixture this weekend. We use an ensemble of three AI models to predict home, draw or win results. Our models use player data, manager data and 5 year historic fixtures data to determine the most likely outcomes.
We calculate match outcome probabilities, decimal odds implied from these probabilities and how the predictions performed last week. The matches are ranked in order of confidence of the prediction (from high to low).
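Converting an outcome probability to its implied decimal odds is simply the reciprocal of the probability (ignoring any bookmaker margin). The fixture probabilities below are invented for illustration:

```python
def implied_decimal_odds(prob: float) -> float:
    """Decimal odds implied by an outcome probability (no bookmaker margin)."""
    return round(1.0 / prob, 2)

# Hypothetical model output for a single fixture: home / draw / away.
match_probs = {"home": 0.55, "draw": 0.25, "away": 0.20}
odds = {outcome: implied_decimal_odds(p) for outcome, p in match_probs.items()}
print(odds)  # {'home': 1.82, 'draw': 4.0, 'away': 5.0}
```

Comparing these implied odds with market odds is a common way to judge whether a model's confidence is higher or lower than the bookmakers'.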
Don’t hesitate to get in touch to discuss further. firstname.lastname@example.org
DISCLAIMER: These predictions are guidelines only. Algospark takes no responsibility for the accuracy of the information or for any losses incurred as a result of decisions or actions taken using these predictions.
Data is everywhere, the insights are exciting and it can power the next generation of systems investment. This is the accepted wisdom, but how do you get started? Unless you work in a technology led business, IT teams rarely lead business innovation, yet they are typically the key gatekeeper to unlocking the power of organizational data. The evolution of IT to cloud, dev-ops, micro-services and containers should be keeping the IT team busy. This is even before considering master data, data lakes, governance and moving away from traditional Extract Transform Load (ETL) procedures.
Exploiting new revenue opportunities and cost savings from data needs to be driven through a business transformation lens. The priorities and needs of the business, balanced against the speed and risks of implementation, are critical success factors for any data science initiative. This is difficult for most people to conceptualize, which is why rapid prototyping in “AI Innovation Hubs” is an excellent way to demonstrate concepts and likely benefits. Seeing the results of a prototype alongside a business case and an agile implementation plan is an excellent way to rally key stakeholders to further develop and launch the initiative. This should be done outside existing IT and data architecture, but mapped into how it can be “productionized” as part of the plan.
AI Innovation Hubs are key to kick starting new and exciting applied data science projects. Working within the confines of existing data insight normally means working within processes and parameters of existing IT. So it is much better to work outside existing frameworks, but co-developed with data insight teams in the AI Innovation Hub. This ensures:
Innovative new projects help uplift and augment current procedures.
Upskill priorities are easily identified and implemented with exciting hands on training initiatives for analysts.
Learnings from prototypes can drive incremental changes to wider data engineering and approaches to ELT.
Overall, these benefits will lead to wider organisation efficiencies from faster access to more relevant data using less processing time and less analyst time. This ultimately results in significantly raising the efficacy of insights and substantially lowering costs.
Want to get started with an Innovation Hub? Get in touch!
Looking to take your cycle to work? Buying a new house? Worried about your walk to work? Take a look at the Algospark crime predictor that pulls together the crime data patterns from the last 3 years in the UK to predict crime hot spots.
Algospark Crime Predictor has been designed to predict crime by type and by postcode sector using AI. Not only does it help with better crime prevention planning, it also feeds investment decision frameworks to improve outcomes regarding new properties and new business locations.
The new age of computing and data has unleashed a whole new approach to scientific discovery. I was lucky enough to spend a few hours at CogX18 and listen to Zavain Dar (https://twitter.com/zavaindar) of Lux Capital explain how we can move away from reasoning grounded in first principles towards a data-driven, theory-free interpretation of how things link together. There is no need to understand the ground truths, nor the myriad of supporting hypotheses and assumptions, in order to prove a concept. It is now possible to apply cognition beyond human understanding to map input X to outcome Y using machine learning / deep learning / cognitive computing.
Pick an historic scientific legend, eg Newton. Imagine the steps, process, ground truth understanding and empirical proofs required to explain gravity! Now imagine taking readings, mapping to outcomes and feeding the results into a deep learning model. With enough iterations, the core concepts and relationships will be implicitly mapped.
In medicine, for example in the field of dermatology, applied machine learning is already yielding diagnoses that are better than those obtained from human specialists. This is saving lives and reducing the amount of specialist input for critical diagnoses.
New radical empiricism is here! It’s real and can be applied to a wide range of fields. However, it is especially relevant with use cases in which ground truths are notoriously difficult to prove. Healthcare and complex system outcome prediction are perfect. This also implies that economics and investment management with their myriad of nested assumptions are also front and centre of the new radical empiricism wave.
Algospark are developing predictive analytics solutions across equity investment, retail location investment and complex systems prediction. These are front and centre for applied new radical empiricism. NRE is nascent, but a great concept and one that will receive increasing focus in coming months and years.
Algospark has just released Crime Explorer which is an interactive crime exploration tool for the UK. Developed for use on desktops, it is a visualisation tool that shows reported crime data by type and location across the UK during 2017.
Knowing and understanding crime patterns is invaluable for location analytics and to support investment decisions into new areas and locations. Data used by Crime Explorer is from data.police.uk and is categorized by month, location and type of crime. The data is rich and can be quickly interpreted, and complements the suite of predictive analytics tools of Location Spark.
The demo version includes data on demography, house prices and concentration of eateries. It is an interactive tool that provides data at the national level and pop-up data at the postcode level across the UK. It has been developed for use on larger screens and for presenting snapshots to complement wider location analytics projects.
Map Explorer is a great complement to Location Spark, which is a mobile-centric, sales pattern prediction tool for any given postcode.
Location Spark is a new service from Algospark. Location Spark provides location analytics for investment decisions in new retail sites. It is a flexible framework that has been co-developed with fast growing UK retail networks. Location Spark brings together a multitude of location data sources, operational metrics and artificial intelligence to predict sales, the type of store and trading patterns.
Reduced analysis time (20-40%).
Increased speed to decision using robust and repeatable process.
Increased forecasting accuracy and minimised probability of poor new site selection.
Rapid automated site evaluation at low cost with great “return on analytics”.
Proactive account management means that events and opportunities are predicted so that customer engagement and service offerings can be optimised.
When implementing a proactive approach, your account management team should be considering questions such as:
Which accounts are most likely to leave?
Which customers have a very high probability of buying additional products?
What products should I recommend to potential leavers?
It’s worth remembering that not all of these questions need answering at once. It is not necessary to launch a programme with large investment, project teams, CRM and IT infrastructure. At Algospark, we generate insights from data and then develop rapid prototypes to fast track value from artificial intelligence solutions.
How does this approach work with the proactive account management questions?
Which customers are most likely to leave? This link shows an example of a churn management solution that uses a Value at Risk (VaR) approach to prioritise client contact.
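The VaR idea can be sketched in a few lines: rank customers by expected revenue at risk, i.e. churn probability multiplied by customer value. This is an illustrative sketch only, with hypothetical field names and figures, not the actual solution behind the link.

```python
# Illustrative churn "value at risk" ranking: expected revenue at risk is the
# churn probability multiplied by the customer's annual value.
# All field names and numbers below are hypothetical.

def value_at_risk(customer):
    return customer["churn_prob"] * customer["annual_value"]

def contact_priority(customers):
    """Customers sorted by expected revenue at risk, highest first."""
    return sorted(customers, key=value_at_risk, reverse=True)

customers = [
    {"id": "A", "churn_prob": 0.10, "annual_value": 50_000},  # at risk: 5,000
    {"id": "B", "churn_prob": 0.60, "annual_value": 12_000},  # at risk: 7,200
    {"id": "C", "churn_prob": 0.90, "annual_value": 2_000},   # at risk: 1,800
]

for c in contact_priority(customers):
    print(c["id"], value_at_risk(c))
```

Note that ranking by raw churn probability alone would put customer C first; weighting by value correctly puts the mid-risk, high-value customer B at the top of the contact list.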
Which customers have a very high probability of buying additional products? This link shows an example of customer centric product recommendation. It uses a hybrid collaborative filtering approach to determine the products with the highest chance of purchase.
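As a rough illustration of the collaborative filtering idea (the hybrid approach described would also blend in customer and product content features), here is a minimal item-based sketch: unowned products are scored by their similarity to the products a customer already holds. All product and customer names are made up.

```python
# Minimal item-based collaborative filtering sketch. Items are represented by
# the set of customers who bought them; similar purchase histories imply
# related products. Data below is a toy example.
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two sparse purchase dicts {customer: count}."""
    common = set(u) & set(v)
    num = sum(u[k] * v[k] for k in common)
    den = sqrt(sum(x * x for x in u.values())) * sqrt(sum(x * x for x in v.values()))
    return num / den if den else 0.0

item_profiles = {
    "savings": {"ann": 1, "bob": 1, "cat": 1},
    "isa":     {"ann": 1, "bob": 1},
    "loan":    {"cat": 1},
}

def recommend(customer, owned, item_profiles):
    """Score unowned items by similarity to the customer's current holdings."""
    scores = {}
    for item, profile in item_profiles.items():
        if item in owned:
            continue
        scores[item] = sum(cosine(profile, item_profiles[o]) for o in owned)
    return sorted(scores, key=scores.get, reverse=True)

print(recommend("ann", {"savings"}, item_profiles))
```

Here "isa" ranks above "loan" for ann because its buyer base overlaps more with "savings", the product she already holds.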
What products should I recommend to potential leavers? The solutions above can be combined so that customer management teams have a script that is highly personalised to the client in terms of behaviour and preferences.
Deep learning networks are famous for their ability to detect cats in images. Advances in computer vision and the application of Convolutional Neural Networks (CNNs) have yielded exciting advances in image classification and computer vision applications. CNNs are used to classify images and identify the objects in them, essentially translating pixel values into information about what is in the image. There are often many layers between pixel values and outcomes, and these layers can be used to determine the style of an image. Early layers tend to identify lines or colours, whereas later layers identify more complex objects and derivations.
Combining the data generated from two images that have passed through a CNN allows a principal content image to be mixed with the style of another image. Content and style are weighted, and the algorithm iterates through numerous passes to align the two images. The style of an image is derived by comparing the activations of convolutional filters and the correlations between them to produce Gram matrices. Further details on the approach and specification can be found here.
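The style computation can be sketched with a few lines of NumPy (normalisation conventions vary between implementations; this follows the general Gram matrix idea rather than any specific paper's exact formulation):

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) feature map: the matrix of
    correlations between the flattened channel activations. Entry (i, j)
    measures how strongly filters i and j co-activate across the image."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)   # one row per channel
    return flat @ flat.T / (h * w)      # shape (channels, channels)

# Toy feature map: 3 channels of 4x4 activations standing in for a real
# convolutional layer's output.
features = np.random.rand(3, 4, 4)
g = gram_matrix(features)
print(g.shape)  # (3, 3)
```

Because the spatial dimensions are summed out, the Gram matrix captures which filters fire together (texture and style) while discarding where they fire (content), which is what lets style be transferred independently of layout.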
We have been experimenting with various content images and style images over the 2017 holiday period. Although the commercial value of such image generation is difficult to quantify (as with traditional art), the neural style transfer approach allows AI to generate amazing new vivid images by combining a “content” image and a “style” image. We have posted various examples to the Algospark Neural Style Transfer Art gallery. These can be found here:
How can you offer great service without an exploding product list? Meeting an increasing number of customer needs from a growing list of customers can lead to exponential growth in product offerings. Do you really want to be the one stop shop for everybody for everything?
Most organisations follow the 80:20 product rule: 80% of customers buy 20% of the product offerings. Products that are not in the top 20% make up the “product long tail”. Whenever there is an efficiency drive, these products typically appear in the cost saving table of a PowerPoint presentation. But these products have been developed to meet customer requirements, and are nearly always part of a portfolio of products that customers buy. How can product investment or divestment decisions be made for specific products without jeopardising customer relationships? How should the product tail be cut? Or, more importantly, what new products should I recommend to customers? The answer is to learn from supermarkets and shopping baskets.
Market basket analytics and product graph analytics are excellent ways to determine “hero products” and their dependencies with other products. These types of analytics measure products by their support (the % of transactions in which the product appears), confidence (the probability of buying product Y given that product X has been bought) and lift (the strength of the product inter-relationship). Product portfolio dashboards are an excellent way to visualise these metrics. They allow fast understanding of key product relationships, making it easy to determine core product clusters and the most important product associations. This can then be linked to evaluation of product financials (i.e. sell products that make money) and development of recommender systems (suggest products that customers want).
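The three metrics can be computed directly from raw transaction data. A minimal sketch for a single rule "x → y", using toy baskets and made-up product names:

```python
# Support, confidence and lift for an association rule "x -> y",
# computed over a list of transaction baskets (sets of products).

def basket_metrics(transactions, x, y):
    n = len(transactions)
    n_x = sum(1 for t in transactions if x in t)
    n_y = sum(1 for t in transactions if y in t)
    n_xy = sum(1 for t in transactions if x in t and y in t)
    support = n_xy / n                 # % of baskets containing both x and y
    confidence = n_xy / n_x            # P(y | x)
    lift = confidence / (n_y / n)      # > 1 means x and y reinforce each other
    return support, confidence, lift

baskets = [
    {"bread", "butter"},
    {"bread", "butter", "jam"},
    {"bread"},
    {"milk", "butter"},
]
print(basket_metrics(baskets, "bread", "butter"))
```

For the toy data, support is 0.5 (2 of 4 baskets contain both), confidence is 2/3 (two of the three bread baskets also contain butter), and lift is below 1, so in this sample bread does not make butter more likely than its base rate.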
So using a product portfolio analytics tool will help keep product development in line with demand patterns. It also helps guide customers to more consistent product portfolios without “exploding” the product list.
Do you write a blog? How does it fit with your marketing and content strategy? How does your blog impact new traffic that visits your site? OK, enough questions. At Algospark, we were interested in a fast prototype to assess web traffic and how the blog is driving interest. We pulled together blog scraping, Google Analytics, predictive analytics and rapid dashboard prototyping to assess what is going on with the Algospark blog. As usual, data, analytics and prediction are at the core of our interest. Having a better understanding of our content mix and traffic impact should help improve this blog. Read more about the concept here: https://algospark.com/#ideation
This is a simplistic first step, but gives great insight into the content mix and how it drives traffic. The application is predicting an 8% uplift in traffic over the next 4 weeks from this article. You can see how the impact evolves, our traffic dynamics and the updated forecast here.
Location selection is key to offline business growth. A large amount of resource is usually involved in site screening, location visits, analysis, prediction and investment review. Algospark Location Analytics has built a framework to expedite the process, make the approach more consistent and reduce the amount of time spent screening and analysing.
Site location involves numerous factors including: economic, demographic, size, customer experience, competitor and proximity outlet considerations. These factors often have complex interactions. And this is where a machine learning framework can help.
Getting the most from location analytics involves taking into account the insights from existing locations. This can be augmented by taking a cluster approach to locations. The value from machine learning comes from using a consistent approach that takes multiple location variables into account for inference and boils them down into a go / no-go decision, supported by a projected sales forecast and profitability metric. Pulling all the factors together into a consistent framework avoids the painful work of comparing multiple tables and factors.
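A heavily simplified sketch of the idea: a trained model (here stood in for by a plain linear projection) turns location features into a sales forecast, which a viability threshold converts into a go / no-go decision. All feature names, coefficients and thresholds below are invented for illustration.

```python
# Go / no-go screening sketch. In practice the projection would come from a
# model trained on existing locations; the linear form and all numbers here
# are hypothetical stand-ins.

def project_weekly_sales(site, coefficients, intercept):
    """Linear projection of weekly sales from location features."""
    return intercept + sum(coefficients[k] * v for k, v in site.items())

def screen_site(site, coefficients, intercept, breakeven):
    """Boil multiple location factors down into a single decision."""
    sales = project_weekly_sales(site, coefficients, intercept)
    return {
        "projected_sales": sales,
        "decision": "go" if sales >= breakeven else "no-go",
    }

coefficients = {
    "footfall_000s": 120.0,       # weekly footfall, thousands
    "competitors_nearby": -450.0, # competing outlets within walking distance
    "median_income_000s": 35.0,   # local median income, thousands
}
site = {"footfall_000s": 18.0, "competitors_nearby": 2, "median_income_000s": 32.0}
result = screen_site(site, coefficients, intercept=1500.0, breakeven=4000.0)
print(result)
```

The point of the sketch is the shape of the workflow, not the model: every candidate site is scored through the same features and the same threshold, which is what makes the screening consistent and repeatable.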
The outputs from this quantitative site evaluation should then be used in conjunction with qualitative overlays such as site visits and site traffic analysis. Our approach to location analytics saves time and ensures a consistent decision-making approach to site selection. It also provides more accurate new site sales projections and trading patterns from the outset.
See how we make the process easier and reduce the risk of a failed new location on the links below.
Artificial Intelligence (AI) does not replace analyst roles. It adds to the capabilities of analysts and the productivity of the wider team.
Machines need to learn. Like humans, they are not built knowing, but need to be trained. The key advantage of machine learning is performing repetitive tasks with high accuracy across huge swathes of information. Machine learning insights always have room for error, and so need oversight. This is also known as “humans in the loop”. The concept has been around for a long time, for example pilots and auto-pilots, and it will be around for a long time to come.
Good analysts need creative thinking and need to be able to contextualize. Machines do not innovate, they process and evolve. This means that the role of analysts has shifted and will continue to shift. Rather than focus on tasks in which machines excel, analysts can get to the next level using machines. This is also known as expert automation.
As machines train and learn, analysts need to do the same. The opportunities for analysts have increased, and so has the rate of learning. So as machines train, analysts must also train. With the growth in online learning, this has never been easier. If you are not keeping up with analytics skills, other analysts undoubtedly are. This means that other analysts will be taking your job, not the machines!
Sophisticated modelling can seem daunting for many organisations wanting to become data centric. Do you have huge databases full of data with no duplication, no data gaps and everything in the same format? Is the data tagged, with clear ownership, access rules and update rules? Is it linked to other relevant sources and mapped to the core business domains and processes? If you have this, great. If you don’t, you are in the majority.
Leading edge AI and machine learning models need quality data that are linked to business outcomes and have labels. This allows algorithms to “train” faster, then “learn” and “evolve”. So quality data drives quality insights, but requires tailoring to needs. In other words, data needs quality controlling and labels attached that link to outcomes and business processes.
Labeling and linking creates data assets that drive value. This means that your differentiation is your data. High volumes of good quality data are the foundation of analytics, insights and artificial intelligence. Your data can always be linked to third party data, but in essence, the more quality proprietary data that can be harvested, the larger the potential data advantage.
The analogy that “data is the oil of the 21st century” applies to data quality. Low grade data (and oil) are expensive to mine and process, offering limited value relative to high grade offerings.
So a large volume of data (a data lake) is great, but your data story needs to map to user journeys and business processes. This is where business transformation meets data science. Business transformation involves mapping out where you want to be (the future state) relative to where you are now (the current state) to determine what to do and where to begin. This could be either an opportunity to save on processes or some new product ideas. The scoping and prioritising activity is a key phase that informs what data is needed, when and what for.
So the recipe for success is to know what, why and when data is needed. Then ensure quality data is available with the correct labeling (“meta-data”) and outcomes mapping. In summary, ensure data quality and robust operations are in place before letting the algorithms loose. After all, your differentiation is your data. Garbage in = garbage out.
Steps to differentiating yourself with data:
Define business areas with biggest potential for value
Start small on data mining, but develop core quality data sources and data processes
Design and build early stage algorithms, knowing that they will initially be prototypes