Video Analytics with In-store CCTV Camera Feeds for a Sports Goods Retailer

When retailers first adopted video analytics, they did so with the aim of reducing or even avoiding losses from incidents such as shoplifting and employee fraud. It was seen as a security protocol and nothing more.

Today, however, video analytics has moved on to being something much more – it has grown into a tool that can be leveraged for business growth. The video surveillance market is set to hit $82 billion by 2025, and the adoption of video analytics by retailers has increased by 16% year on year for the last seven years.

Video analytics delivers value to retailers in many forms – measuring store performance, enhancing the customer experience, increasing engagement with customers and building customer loyalty.


Many retailers face the issue of high footfall but disproportionately low point-of-sale revenue. Video analytics allows retailers not only to make decisions based on footfall data, but also to drill down further and analyze the following:

  • Did consumers compare a particular product with other brands?
  • Who are the returning customers?
  • Who exited the store without purchasing anything?
  • Did they look at an item or spend significant time in a particular area of the store?
  • Did they pick up a product?
  • Did they buy multiple products?

Indium’s cognitive analytics capabilities helped one of the largest sporting goods retailers in the world increase its customer satisfaction levels by more than 15%.

The client had 1500+ stores across more than 45 countries. Being a giant in the space and competing in the fast-moving retail market meant business decisions and market strategies needed to be flexible enough to change quickly.

To achieve this, there was a need for real-time store visitor analytics. Footfall data is what most companies use today to achieve this. However, the client wanted to step it up by linking footfall data with POS data. This needed a robust yet simple solution that generated accurate results.

The requirement laid out by the client was to improve store performance and increase customer satisfaction. To meet it, Indium had to:

  • Leverage security cameras across the store to generate a store heat-map.
  • Leverage CCTV camera data to understand the variations in footfall by area of the store and time of the day.
  • Identify customers using facial recognition to know the number of customers leaving the store without making a physical purchase.
  • Build comprehensive dashboards with real-time data refresh for better insight into customer behavior and shelf-zone performance.

Seeing this as an opportunity to solve a difficult problem for our client with an out-of-the-box solution, Indium implemented the following:

Solution Implementation:

Indium analyzed the video feeds collected from the cameras installed on the shop floor and built a cognitive analytics solution that leveraged the data to meet the requirement:

  • First, Indium used ImageAI to enable the security cameras to count the number of customers who enter the store in a given time period. In addition, customized functionality was incorporated to count the number of people who performed a particular activity – like walking to the cycling section of the store (a minimal person-counting sketch follows this list).
  • A neural network model was built for image processing and video analytics, then analyzed and optimized to ensure maximum accuracy.
  • As a next step, outliers were identified within the gathered data points.
  • Pattern recognition was applied around all the shelf zones.
  • Even though this was a very complex neural network to build, there could be zero compromise on accuracy.
  • Each layer of the neural network was trained on various classes of objects and persons, using 2,000+ images.
  • Annotations were created and tested using sample videos. The model becomes more accurate over time: the more videos used, the higher the accuracy.
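
The case study does not spell out the exact detection pipeline, so the following is only a minimal sketch of per-frame person counting, assuming ImageAI's VideoObjectDetection API with a pre-trained YOLOv3 weights file; the file paths, threshold and callback are illustrative, and method names vary slightly across ImageAI releases.

```python
# Minimal sketch: count people per frame in a CCTV clip with ImageAI.
# "yolo.h5" and the video paths below are placeholders, not the client's setup.
from imageai.Detection import VideoObjectDetection

detector = VideoObjectDetection()
detector.setModelTypeAsYOLOv3()
detector.setModelPath("yolo.h5")
detector.loadModel()

def per_frame(frame_number, output_array, output_count):
    # output_count maps detected object names to counts for this frame,
    # e.g. {"person": 4}; log the person count so it can be aggregated later.
    print(f"frame {frame_number}: {output_count.get('person', 0)} people")

detector.detectObjectsFromVideo(
    input_file_path="store_entrance.mp4",
    output_file_path="store_entrance_annotated",
    frames_per_second=10,
    per_frame_function=per_frame,
    minimum_percentage_probability=40)
```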


Business Impact:

  • An accuracy of more than 80% was achieved in the models specifically built to target customer behavior. This ensured improved customer engagement and better targeting of customers.
  • Comprehensive analysis on the conversion rate from visitor statistics to live sales via the POS systems, and analysis of customer interaction in any product section and the product-wise conversion rate, helped improve product placement and cross-selling across product categories.
  • The client saw a solid 15% increase in customer satisfaction post implementation of Indium’s solution.
  • The easy-to-use interface gave all stakeholders better insight into customer behavior and shelf-zone performance.
  • 70% cost savings in the short and long term, as the tools used in the project were open source.
  • Highly comprehensive dashboards were built with real-time data refresh, tailored to the client’s analytical and business needs.

How Indium can help you start a 100% Remote Mendix Project

Remote working has become especially hot in times of Covid-19, but collaboration between teams distributed across geographies is nothing new. Many platforms enable this.

What makes the low-code platform Mendix unique is that it is a high-productivity platform that allows development speeds not seen in traditional development platforms. Mendix has several built-in features that make this acceleration of development projects possible.


A few advantages of the Mendix platform:

  • Agile development requires third party storyboards that need to be integrated into the development process. In Mendix, the storyboarding feature is in-built, enabling creation and modification of storyboards on the fly.
  • For requirements management, traditional development platforms rely on forums that are unofficial and scattered across the web. Mendix has an official, in-house forum where solutions can be found and collaboration with other developers is possible.
  • The reusability of components is high. In traditional development, component development takes effort and the result needs to be stored in a JAR. In Mendix, the artefacts can be collected and stored in the forum or in a community that is public or private. These components can be leveraged across applications as a widget or a module.
  • UI is another area where Mendix scores high. SAP has a UI layer called Fiori on which most SAP applications are built, available only to SAP users. Mendix provides a ready-made Fiori framework for SAP integration, so even a developer with no SAP experience can use it for an SAP application.
  • Any third-party integration can be done easily with the available artefacts, making the process very simple.
  • Mendix deployment is also simple as it comes as a containerized stack. In other development environments, the developer faces the challenge of sizing the application. If there are n-number of users, the developer must work out the size of the application. In Mendix, at the time of enabling the required number of licenses, their container size is automatically determined. The entire deployment process is also automated and simplified.

Advantage Indium

Indium Software has two decades of experience working with cutting-edge technologies, building teams and skills that keep pace with the latest solutions. To equip our teams with the qualifications required to leverage the Mendix platform, Indium encouraged its developers to pursue certifications three to four years ago.

Today, we have 56 developers with the basic certification; 25 are working through the intermediate level between the Rapid and Advanced certifications, and 5-10 are on their way to Advanced-level certification.


Indium takes advantage of the forums to further advance its understanding of the platform and to get help from Mendix when a new area needs their input; developers can approach Mendix with queries and receive responses. It is a matter of pride for Indium that its developers created innovative proofs of concept using Mendix with Webex applications and an HL7 engine, which got Mendix interested to know more.

Indium has other such firsts, including Kafka integration for real-time processing, iFrame embedding for Tableau and PowerBuilder, LDAP integration, and a connector for ViDM SSO, among others.

Bluetooth scanning, QR code scanning and biometric authentication are some of the application areas where Mendix has been used innovatively. Indium also has expertise in Mendix QA automation.

Indium has leveraged Mendix across domains including Healthcare, Manufacturing, Realty, Financial Services, Retail and Government.


With such rich experience and cross-domain expertise, Indium is an ideal partner for enabling remote Mendix project management. If you have a requirement, call us now.

Federated Learning with TensorFlow

Federated learning is a machine learning setting that enables multiple parties to jointly train a shared model without sharing their data with each other. This has advantages such as greater scale of intelligence and improved privacy.

In plain English, federated learning is a way for machines to learn without even seeing the data, so you don’t have to fear your privacy being traded away for the sake of better service.

Decentralised Data and Intelligence

A large amount of data nowadays is born in decentralised silos like mobile devices and IoT devices. Building collective intelligence from this decentralised data in the traditional way has always been challenging.

Why is it Challenging?

Traditionally, the individual data sources transmit the data they generate to a centralised server, the machine learning algorithm is trained on the collected data, and the intelligence is centralised. All clients then make request calls to the central model, sending their data along, in order to unlock the hidden knowledge present in that data.


What is inherently challenging in this era of decentralised data is:

  1. The sensitivity of the data
  2. High latency

The Sensitivity of the Data

The data these isolated data generators create might contain some of the user’s personal data, which they don’t want to share with the world, even for charitable reasons.

Example: a hospital will not share patient data with any other organisation; a friend of yours won’t share his personal text messages with you.

High Latency

Applications that need a low turnaround time cannot benefit from this model of intelligence, as the network calls bring high latency with them before the intelligence can be unlocked.

Machine Learning in the Era of Decentralised System – Federated Learning

Federated learning, as the term suggests, means:

“A setup that acts as a single centralised unit within which each state or division keeps some internal autonomy.”

Thus all the participants act together as a group, while each of them retains its own right to contribute and utilise, preserving the decentralised nature as a whole.

How Federated Learning Answers the Challenges of Conventional Machine Learning

The federated learning approach has four major components. Let us see, step by step, how it works with these components:

  1. Initial Model
  2. Local Training on clients
  3. Encrypted communication of weights to the server
  4. Aggregation over Encrypted Values.

Initial Model

The model owner prepares the initial model, based on random initialization of the parameters.

Local Training on Clients

Each client trains the model on the data it has created. This training is not run to convergence; it is done for a few iterations to capture the updated weights. Not every client device trains at any given time: only the devices that satisfy certain conditions are used for training.

In the case of mobile devices, for example, they can participate in training only when they are plugged into a power source and a Wi-Fi connection is available.

Encrypted Communication of Weights to the Server

Once training is done, the new weights are communicated to the server for aggregation. The model weights are encrypted before being sent out to the server.

Aggregation in the Server

The server performs the aggregation on the encrypted data, and the new model is created as an aggregate of the updates from all the clients participating in the training process.

This cycle of to-and-fro communication and weight updates is carried out until convergence is reached at the server.
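
To make the aggregation step concrete, here is a toy sketch of plain federated averaging: the server takes a weighted mean of client weights, weighting each client by how many samples it trained on. Real deployments aggregate encrypted values via secure aggregation, which this illustration deliberately leaves out.

```python
# Toy federated-averaging sketch (unencrypted, for illustration only).
import numpy as np

def federated_average(client_weights, client_sample_counts):
    """client_weights: one list of numpy arrays (layer weights) per client.
    client_sample_counts: number of training samples each client used."""
    total = sum(client_sample_counts)
    num_layers = len(client_weights[0])
    averaged = []
    for layer in range(num_layers):
        # Weighted sum of this layer's weights across all clients.
        layer_avg = sum(w[layer] * (n / total)
                        for w, n in zip(client_weights, client_sample_counts))
        averaged.append(layer_avg)
    return averaged

# Example: two clients, each holding a single 2x2 weight matrix.
client_a = [np.ones((2, 2))]
client_b = [np.zeros((2, 2))]
print(federated_average([client_a, client_b], [300, 100]))  # skewed towards client_a
```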

A Practical Walkthrough in TensorFlow

Let us apply this framework of federated learning to train a deep neural network. We will take the MNIST dataset as an example and create the federated model step by step.

Step 1: Setting up the environment

Verify that the environment is set up correctly before following the tutorial. If you can see the welcome message, you are good to go.
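
A minimal environment check, assuming TensorFlow Federated (TFF) has been installed, for example with pip install tensorflow-federated; this mirrors the standard TFF sanity check.

```python
import tensorflow_federated as tff

# If the TFF runtime is wired up correctly, this prints the welcome message.
print(tff.federated_computation(lambda: 'Hello, World!')())
```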

Step 2: Data Preparation and Client creation

Federated learning requires data to be prepared in a federated format. Since we are simulating the participants of our federated training in this tutorial, we can use the simulated dataset that ships with TFF.

This simulated dataset splits the data and creates a virtual federated client around each split (accessible through a client_id).

Load the data from the TFF simulation module and check the element types.

We can see that the element structure has the input and target marked as ‘pixels’ and ‘label’.

Let us prepare the dataset by flattening the images and prefetching the next batches.

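The original post shows this step as a screenshot; the sketch below reconstructs the equivalent preprocessing, assuming the federated EMNIST dataset that ships with TFF. Constants such as the client count and batch size are illustrative.

```python
import collections
import tensorflow as tf
import tensorflow_federated as tff

# Load the simulated federated dataset; each client_id corresponds to one writer.
emnist_train, emnist_test = tff.simulation.datasets.emnist.load_data()

NUM_CLIENTS = 10      # illustrative values
BATCH_SIZE = 20
NUM_EPOCHS = 5
SHUFFLE_BUFFER = 100
PREFETCH_BUFFER = 10

def preprocess(dataset):
    # Flatten each 28x28 image into a 784-vector and rename features to x / y.
    def batch_format_fn(element):
        return collections.OrderedDict(
            x=tf.reshape(element['pixels'], [-1, 784]),
            y=tf.reshape(element['label'], [-1, 1]))
    return dataset.repeat(NUM_EPOCHS).shuffle(SHUFFLE_BUFFER).batch(
        BATCH_SIZE).map(batch_format_fn).prefetch(PREFETCH_BUFFER)

# One preprocessed tf.data.Dataset per simulated client.
sample_clients = emnist_train.client_ids[:NUM_CLIENTS]
federated_train_data = [
    preprocess(emnist_train.create_tf_dataset_for_client(c))
    for c in sample_clients]
```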

Step 3: Model Creation

We create a simple deep learning model with a single dense layer followed by a softmax layer to classify the images.


This model function is a normal TensorFlow model; to use it for federated learning, TFF provides a wrapper that takes in the model along with a sample batch (or input spec) and wraps it for our purpose.
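
The original post shows this step as a screenshot as well; here is a sketch of the model and the TFF wrapper, assuming a TFF release where tff.learning.from_keras_model takes an input_spec (older releases took a dummy batch instead, as described above).

```python
def create_keras_model():
    # A single dense layer followed by softmax over the 10 digit classes.
    return tf.keras.models.Sequential([
        tf.keras.layers.InputLayer(input_shape=(784,)),
        tf.keras.layers.Dense(10, kernel_initializer='zeros'),
        tf.keras.layers.Softmax(),
    ])

def model_fn():
    # TFF requires a fresh, unbuilt Keras model to be constructed inside this function.
    keras_model = create_keras_model()
    return tff.learning.from_keras_model(
        keras_model,
        input_spec=federated_train_data[0].element_spec,
        loss=tf.keras.losses.SparseCategoricalCrossentropy(),
        metrics=[tf.keras.metrics.SparseCategoricalAccuracy()])
```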

Step 4: Training the Federated Model

Training (or learning) in a federated setup is done with the help of the tff.learning.build_federated_averaging_process function, which takes three parameters:

  1. Model wrapped inside tff wrapper
  2. Client optimiser
  3. Server optimiser

Iterating over the data and aggregating on the server for several rounds leads to better convergence of the model.
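
A sketch of building and running the process named above; in newer TFF releases this API has moved (for example to tff.learning.algorithms.build_weighted_fed_avg), and the learning rates and round count here are illustrative.

```python
# Build the federated averaging process with separate client and server optimizers.
iterative_process = tff.learning.build_federated_averaging_process(
    model_fn,
    client_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=0.02),
    server_optimizer_fn=lambda: tf.keras.optimizers.SGD(learning_rate=1.0))

state = iterative_process.initialize()

NUM_ROUNDS = 10  # illustrative; the walkthrough below uses 1,000 rounds
for round_num in range(1, NUM_ROUNDS + 1):
    # Each round: broadcast the model, train locally on the clients, aggregate.
    state, metrics = iterative_process.next(state, federated_train_data)
    print('round {:2d}, metrics={}'.format(round_num, metrics))
```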


Running the same iteration steps for 1,000 rounds over randomly sampled client devices leads to improved accuracy.

Logging the metrics captured across every training round and visualising them in TensorBoard, we can see the loss declining smoothly.
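
A sketch of that logging, again with illustrative paths; the exact structure of the metrics object varies across TFF versions.

```python
# Write one scalar per training metric per round, then inspect with:
#   tensorboard --logdir /tmp/logs/federated
summary_writer = tf.summary.create_file_writer('/tmp/logs/federated')

state = iterative_process.initialize()
with summary_writer.as_default():
    for round_num in range(1, NUM_ROUNDS + 1):
        state, metrics = iterative_process.next(state, federated_train_data)
        for name, value in metrics['train'].items():
            tf.summary.scalar(name, value, step=round_num)
```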

Results of the training can be seen in the graph.

Future Work:

With a rapidly changing landscape, TensorFlow and federated learning roll out a lot of updates every week. In the coming weeks, we can look at how to use a pre-trained model in a federated learning setup, how to use tensorflow_encrypted to carry out encrypted computation, and how model types other than deep learning fit in.

Analytics in E-commerce and Indium’s Expertise

The global e-commerce analytics market is expected to generate US$22.412 billion by 2025, up from US$15.699 billion in 2019, growing at a CAGR of 6.11 per cent, according to ResearchAndMarkets.com’s report ‘Global E-Commerce Analytics Market – Forecasts from 2020 to 2025’.

One of the key drivers is increasing disposable income, which has improved people’s purchasing power. The convenience of ordering products online on e-commerce platforms and retail stores will further stimulate market growth.

To meet this growing demand and understand their customers better, e-commerce businesses are increasingly investing in advanced business intelligence and analysis tools. These can provide insights into which products are moving fast and in which markets, and into how to improve operations to serve customers better, maximize profits and gain a competitive edge.

3 Focus Areas

E-commerce analytics falls into three main areas:

  • Data Visualization and Descriptive Analytics: Dashboards created using historical data of customer behaviour and sales records provide snapshots of all key metrics for improved decision making
  • Predictive Analytics: Using churn prediction, market-basket analysis and the like, e-commerce marketplaces can predict the demand for products and design promotions to cross-sell and upsell for improving sales and customer engagement (a minimal market-basket sketch follows this list)
  • Cognitive Analytics: Video and images are analysed for product classification based on predefined parameters to quickly upload new products and avoid errors and time delays associated with manual intervention
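
As a purely illustrative example of the market-basket technique mentioned above, a minimal sketch with mlxtend's apriori implementation could look like this; the transactions and thresholds are made up.

```python
# Minimal market-basket sketch using mlxtend; the order data is hypothetical.
import pandas as pd
from mlxtend.frequent_patterns import apriori, association_rules

# One row per order, one boolean column per product.
orders = pd.DataFrame({
    'running_shoes': [1, 1, 0, 1, 0],
    'sports_socks':  [1, 1, 0, 0, 1],
    'water_bottle':  [0, 1, 1, 1, 0],
}).astype(bool)

itemsets = apriori(orders, min_support=0.4, use_colnames=True)
rules = association_rules(itemsets, metric='lift', min_threshold=1.0)

# Rules with high lift suggest product pairs worth promoting together.
print(rules[['antecedents', 'consequents', 'support', 'confidence', 'lift']])
```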

Challenges and Benefits

For e-commerce platforms and the online stores of retail outlets, an understanding of which products are moving, where their customers are coming from and what their customers are saying is very important.

When a product is performing well, they can boost it further by creating suitable marketing collaterals and also pair it with likely related products to increase the overall sales and growth.


A product that is not performing well needs equal promotional effort; special offers and discounts to increase its visibility can be designed to improve its sales.

Based on geographies, retail businesses can also plan their campaigns for their stores in those locations and step up promotions for those geographies where they have a presence but not as many footfalls.

Machine learning and artificial intelligence can be used for cross-selling and upselling of related products. For example, when someone is purchasing a mobile, relevant accessories can be displayed to encourage customers to purchase a mobile case or headphones, and so on. When a customer purchases a particular model, they can be tempted with a higher model with better and more features.

Analytics can also be used to understand conversion rates from footfall to sales and the insights used to improve the conversions. Reviews, both positive and negative, are a storehouse of information on what works and what doesn’t.

Negative review analytics helps build a product line with the quality to meet customer expectations. Sentiment analysis allows e-commerce players to build on their strengths, rectify their weaknesses and retain unsatisfied customers.

For instance, on one e-commerce site, a particular bag was very popular but soon negative feedback started pouring in. On analysis, it was discovered that the bag itself was still good, but a flap added as a design element was made of a different material that did not last as long as expected. This is valuable input for the e-commerce marketplace as well as the manufacturer.
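
Indium's own review work used its proprietary teX.Ai; purely as a generic illustration of scoring reviews like the one above, a minimal sketch using the Hugging Face transformers sentiment pipeline (which downloads a default English model on first run) could look like this.

```python
# Generic sentiment-scoring sketch; not the teX.Ai pipeline used in the case study.
from transformers import pipeline

classifier = pipeline('sentiment-analysis')  # default pre-trained English model

reviews = [  # made-up reviews for illustration
    "The bag is great but the flap wore out within a month.",
    "Excellent quality, exactly as described.",
]
for review, result in zip(reviews, classifier(reviews)):
    # Each result carries a label (POSITIVE/NEGATIVE) and a confidence score.
    print(result['label'], round(result['score'], 3), '-', review)
```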

Competitor analysis can also be used to devise marketing and, more importantly, pricing strategies to improve the edge over business rivals. Marketplaces and FMCG can especially benefit from this.

Use Cases

Indium used sentiment analysis for a sports retailer, analysing reviews to understand customer perception of and feedback on its products. Indium’s proprietary data extraction tool, teX.Ai, enabled extracting key phrases to gain insights into customer views. This helped the sports retailer improve its design and customer service.

For an e-commerce aggregator, Indium used teX.Ai to automate product classification.

Chats with customers, whether with a chatbot or with a customer executive over the phone, can be another rich source of insight into customer satisfaction levels. Using data extraction, the discussion can be analysed for what the customer needed, how it was responded to and whether it was concluded satisfactorily. This is crucial in building customer loyalty and training the executives and the chatbots to ensure there is closure.

Analytics can also be used for resource optimisation to reduce the waiting time of customers trying to reach a representative.


Indium Advantage

Indium Software, in its more than two decades of existence, has been providing holistic solutions on cutting edge technologies. It has carefully built a team that is a judicious mix of domain and technology experts.

Our e-commerce team can set up and run a marketplace from the ground up using the latest technologies including in-build analytics. It can also build solutions for analytics on existing platforms using machine learning and artificial intelligence. Strong solution architects, subject matter experts and expertise in analytics make Indium an ideal partner for e-commerce platforms and retail brands seeking to leverage the World Wide Web.

Penetration Testing on Cloud Environment – Important Things to Consider

Technically, a penetration test on the cloud computing environment does not differ that much from any other penetration test, even an on-premise equivalent.

You may have moved data to the cloud. But that doesn’t mean your responsibilities for securing it are gone.

In a hybrid cloud environment, where some data is stored locally while some lives in the cloud, security must be assessed wherever information resides.

Penetration testing probes for weaknesses that could compromise security, perhaps leading to a data breach.

When your organization stores sensitive information on behalf of customers, like medical or financial records, you are not just responsible for protecting their data; you also must ensure that all of your outsourcing venues are following proper protocol.

How is a typical pen test carried out?

Pen tests start with a phase of reconnaissance, during which an ethical hacker spends time gathering data and information that they will use to plan their simulated attack.

After that, the focus becomes gaining and maintaining access to the target system, which requires a broad set of tools.

Tools for attack include software designed to produce brute-force attacks or SQL injections.

There is also hardware specifically designed for pen testing, such as small inconspicuous boxes that can be plugged into a computer on the network to provide the hacker with remote access to that network.

In addition, an ethical hacker may use social engineering techniques to find vulnerabilities.

For example, sending phishing emails to company employees, or even disguising themselves as delivery people to gain physical access to the building.

The hacker wraps up the test by covering their tracks; this means removing any embedded hardware and doing everything else they can to avoid detection and leave the target system exactly how they found it.

What happens in the aftermath of a pen test?

After completing a pen test, the ethical hacker will share their findings with the target company’s security team.

This information can then be used to implement security upgrades to plug up any vulnerabilities discovered during the test.

These upgrades can include rate limiting, new WAF rules, and DDoS mitigation, as well as tighter form validations and sanitization.


Challenges of Cloud Pentesting

In the past, testing of cloud-based applications and infrastructure was somewhat restricted because of legal and geographical complications.

Security enthusiasts and professional penetration testers were not permitted to perform intrusive security scans or penetration tests on cloud-based applications and environments without the explicit permissions of Cloud Service Providers like Microsoft Azure and AliCloud.

But the growing number of cyber attacks targeting the cloud in recent years is paving the way for mainstream cloud computing penetration testing.

The recent Capital One data breach showed that a misconfigured identity and access management (IAM) setup on AWS was enough for a malicious attacker to obtain adequate credentials to illegally access Amazon S3 buckets and retrieve the information stored within.

Organizations are now open to QA outsourcing to conduct penetration tests on their cloud environments under controlled circumstances.

But before going deep into what a cloud environment pentest entails, it pays for users to understand that security of the cloud is a shared responsibility.

Cloud service providers like Amazon Web Services (AWS) inherently build security in their infrastructure.

Cloud firewalls such as Security Groups are configured by default to disallow all traffic unless otherwise specified by the user.

It is this user flexibility that is ballooning the risk of human error in the cloud.

If end users accidentally change a configuration, such as removing a Security Group whitelist for a VPN or internal IP, they open up their cloud infrastructure and applications to a larger attack surface.

Pen-testing on cloud environment – The Execution

1) Understand the policies of the cloud provider

Putting private clouds aside for now, public clouds have policies related to pen-testing.

In many cases, you must notify the provider that you’re carrying out a test, and it puts restrictions on what you can actually do during the pen-testing process.

So, if you have an application that runs on a public cloud and would like to pen test it, you’ll need to do some research first regarding the process your cloud provider recommends.

Not following that process could lead to trouble. For instance, your pen test will look a lot like a DDoS attack, and the provider may shut down your account.

All cloud providers proactively monitor their infrastructure for anomalies. In some cases, humans may give you a call to find out what’s up.

In most cases, cloud service providers have automated procedures in place that shut down the system without warning when they perceive a DDoS attack.

You could come into the office the next day and find that your cloud-delivered storage systems, databases, and applications are offline, and you’ll have some explaining to do to get them back up and running.


The long and short of this is that there are rules of the road when it comes to public clouds.

You have to understand the legal requirements of the pen testing, as well as policies and procedures, or else you’ll quickly find yourself off the cloud system.

2) Create a pen-testing plan

Those who plan to do a cloud application pen test first need to create a pen-testing plan.

The test plan should be agreed to by the pen-testing team, and each part of the plan should be followed. Any exceptions that occur are really part of the results, such as an application admin seeing the pen test occurring and killing access for the pen-testing team.

3) Select your pen-testing tools

There are many pen-testing tools on the market. While pen testing cloud-based applications with on-premises tools is a popular approach, there are now cloud-based pen-testing tools that may be more cost-effective.

Moreover, they don’t require huge hardware footprints: it’s a cloud pen testing a cloud. What’s important about the tool is that it can simulate an actual attack.

In Summation

Pen testing is not an option these days. It’s the only way to prove that your cloud-based applications and data are secure enough to allow the maximum amount of user access with the minimum amount of risk.

Striim-Powered Real-Time Data Integration of Core Banking System with Azure Synapse Analytics

Cloud-based technologies such as the Azure Synapse data warehouse, formerly Azure SQL Data Warehouse, enable banks to leverage their analytical capabilities to get insights that can help with operational decision making on a continuous basis.

It allows querying data as per the bank’s requirements and brings together enterprise data warehousing and Big Data analytics. Based on these insights, banks can devise strategies for improved efficiencies in operations and development of products for better customer service.

Striim for CDC

A platform such as Striim enables the transfer of data from heterogeneous, on-premise data warehouses, databases, and AWS into Azure Synapse Analytics with in-flight transformations and built-in delivery validation. This helps with operational decision making on a continuous basis.

For the exercise to be really fruitful in today’s world of instant response, the data being transferred needs to be as current and as close to the source database on the core banking system as possible. A platform like Striim enables this data integration from the source table to the target using Change Data Capture (CDC).


CDC allows data from on-prem sources, whether an RDBMS, NoSQL, or any other type, to be created and updated in near real-time in a Synapse table or ADLS Gen2 (Azure Data Lake Storage Gen2). It doesn’t hit the source database directly.

Instead, it captures all the transactions, be it an update, insert, or delete, from the on-prem source database’s log and has them generated and duplicated on the target database.

This way, the performance of the source database is not affected while there is access to data on the cloud in near real-time for analysis and response.

Advantage Striim

One of the factors that make Striim the most desired CDC tool is its price point while being feature-rich. An evolving tool, it also allows for features such as UDF (User Defined Function) that can be plugged in on the fly. It allows for data manipulation and querying based on the unique needs of the bank. The icing on the cake is the reporting feature with live dashboards and a diverse set of metrics for effective data monitoring.

Its built-in monitoring and validation features include:

  • Ensure consistency through continuous verification of the source and target databases
  • Enable streaming data pipelines with interactive, live dashboards
  • Trigger real-time alerts via web, text, email

By powering the data integration of the on-prem database of the core banking system with Azure Synapse using Striim, banks can ensure continuous movement of data from diverse sources with sub-second latency.

It is a non-intrusive way of collecting data in real-time from production systems without impacting their performance. It also allows for denormalization and other transformations on data-in-motion.

The data warehouses Striim supports include:

  • Oracle Exadata
  • Teradata
  • Amazon Redshift

Databases:

  • Oracle
  • SQL Server
  • HPE NonStop
  • MySQL
  • PostgreSQL
  • MongoDB
  • Amazon RDS for Oracle
  • Amazon RDS for MySQL

Striim can integrate data in real time from logs, sensors, Hadoop, and message queues into real-time analytics.


Indium – A Striim Enabler

Indium Software is a two-decade-old next-generation digital and data solutions provider working with cutting edge technologies to help banks and traditional industries leverage them for improving their business process, prospects, and efficiencies.

We can help identify the tables in the core banking system that need to be replicated on the target Synapse and set up the Striim platform for smooth integration. Leaders in implementing Striim, we have successfully led several such integrations across sectors. Our team has cross-domain experience and technology expertise, which helps us become partners in the truest sense.

If you would like to leverage cloud and analytics through Striim, contact us here: