next click recommender
Helping an e-commerce company nearly triple their add-to-cart conversion rates.
Our client’s low conversion rate was primarily a navigation problem: web visitors often abandon their online shopping sessions because it takes too long to navigate the site before they find the right product or service.
Collaborative filtering recommendation systems are typically used by online retailers to increase conversion and boost revenue. However, because our client’s product line was fragmented and highly specialized, these recommenders failed to detect individual customers’ preferences and ended up producing only the most trivial recommendations (such as the overall most popular products). In addition, there was no suitable data foundation for stable collaborative filtering: user comments and ratings were sparse and mostly anonymous, as the site did not require a login.
Instead of relying on other users’ collective feedback, our solution focuses on the current web visitor and their activity in real time. The key premise is that an individual sequence of web browsing events can serve as a “fingerprint” indicative of personal preferences and shopping behavior. A master model, trained on historical event logs with a combination of process mining techniques, predicts the optimal path to selecting a product and adding it to the cart, and suggests the next click to the prospective customer. The model is self-learning and continually updated, so its precision and overall value grow with prolonged use.
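For illustration only, here is a minimal Python sketch of the underlying idea, not the production model: a first-order directly-follows graph is learned from historical clickstream sessions, and the suggested next click is the neighbor most likely to eventually lead to add-to-cart. The event names and session format are assumed for the example.

```python
# Minimal sketch of the next-click idea: learn a directly-follows (first-order
# Markov) graph from historical sessions, then recommend the neighbor with the
# highest estimated chance of eventually reaching "add_to_cart".
from collections import defaultdict

def fit_transitions(sessions):
    """Count directly-follows transitions and normalize to P(next | current)."""
    counts = defaultdict(lambda: defaultdict(int))
    for session in sessions:
        for current, nxt in zip(session, session[1:]):
            counts[current][nxt] += 1
    return {
        state: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for state, nxts in counts.items()
    }

def reach_probability(transitions, goal="add_to_cart", sweeps=50):
    """Estimate P(eventually reaching `goal`) from each state by value iteration."""
    value = defaultdict(float)
    value[goal] = 1.0
    for _ in range(sweeps):
        for state, nxts in transitions.items():
            if state != goal:
                value[state] = sum(p * value[nxt] for nxt, p in nxts.items())
    return value

def suggest_next_click(transitions, value, current_page):
    """Recommend the neighbor with the best chance of leading to add-to-cart."""
    neighbors = transitions.get(current_page, {})
    return max(neighbors, key=lambda nxt: value[nxt], default=None)

# Illustrative sessions (page-event sequences ending in conversion or exit).
sessions = [
    ["home", "search", "category", "product", "add_to_cart"],
    ["home", "category", "product", "exit"],
    ["home", "search", "product", "add_to_cart"],
]
transitions = fit_transitions(sessions)
value = reach_probability(transitions)
print(suggest_next_click(transitions, value, "home"))  # e.g. "search"
```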
automated patent analysis and mapping
A custom software tool for intellectual property mapping and automated patent analysis, built for a semiconductor technology startup.
The amount of patent information is growing at an ever-increasing rate: the number of patents issued annually by the US Patent and Trademark Office nearly doubled from 2005 to 2015. Because technical subject matter experts must process these large volumes of information cognitively and with a high level of detail, building and maintaining an IP map becomes ever more labor-intensive (i.e., expensive).
Our client was looking for a software tool that would automate patent document ingestion and data processing in order to produce detailed summaries of patent claims, identify which technical areas were already too crowded and which still offered plenty of “white space” (and why), understand and internalize prior art, learn from existing knowledge, and avoid possible infringement. At the core of our solution was a suite of NLP models that search for and expose key phrases, specialized terms and jargon, inventor and assignee names, and patent claim features. These models were tailored to the client: we held many live discussions with subject matter experts to sharpen the models’ precision and attune them to the client’s unique needs.
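As a rough illustration of the kind of extraction involved (not the custom-trained models themselves), the sketch below uses an off-the-shelf spaCy pipeline to surface candidate key phrases and inventor/assignee mentions; the model name ("en_core_web_sm") and the sample claim text are assumptions.

```python
# Illustrative extraction pass over a single (made-up) patent claim sentence,
# using a generic spaCy pipeline rather than custom-trained models.
import spacy
from collections import Counter

nlp = spacy.load("en_core_web_sm")

claim = (
    "1. A semiconductor device comprising a gallium nitride layer disposed on a "
    "silicon substrate, the layer having a thickness of 10 to 50 nanometers, "
    "assigned to Example Semiconductor Corp. by inventor Jane A. Doe."
)
doc = nlp(claim)

# Candidate key phrases: multi-word noun chunks, lowercased and counted.
key_phrases = Counter(
    chunk.text.lower() for chunk in doc.noun_chunks if len(chunk) > 1
)

# Candidate inventor / assignee mentions: PERSON and ORG entities.
names = [(ent.text, ent.label_) for ent in doc.ents if ent.label_ in ("PERSON", "ORG")]

print(key_phrases.most_common(5))
print(names)
```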
The key features of our product include:
- Automated ingestion, aggregation and summarization of patent documents. This is where unstructured text is converted into a structured data set.
- Dynamic patent landscape map showing the relationships and affinities among various key phrases and technical terms. Typically, this map is highly non-homogeneous, with clusters and white space indicating areas of overcrowding versus opportunities for novelty.
- Automated derivation of technical summary tables. This feature answers deep technical questions, such as which compositions and property ranges have been covered in patent claims and examples. Detected property ranges are also converted to a common set of units (see the normalization sketch after this list).
- Inventor and assignee identity resolution algorithm for the numerous cases where company and inventor names appear under several different spelling variants across patents (see the name-matching sketch after this list).
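To make the property-range conversion concrete, here is a minimal sketch for a single property type (layer thickness), assuming nanometers as an illustrative base unit; the production tool covers many more properties and unit systems.

```python
# Minimal property-range normalization: find "low to high <unit>" spans and
# convert both bounds to nanometers (illustrative base unit and unit list).
import re

# Conversion factors to nanometers for an assumed subset of units.
TO_NM = {"nm": 1.0, "nanometers": 1.0, "um": 1e3, "micrometers": 1e3, "mm": 1e6}

RANGE_PATTERN = re.compile(
    r"(?P<low>\d+(?:\.\d+)?)\s*(?:to|-|–)\s*(?P<high>\d+(?:\.\d+)?)\s*"
    r"(?P<unit>nm|nanometers|um|micrometers|mm)\b",
    re.IGNORECASE,
)

def extract_ranges_nm(text):
    """Extract numeric ranges and express both bounds in nanometers."""
    ranges = []
    for match in RANGE_PATTERN.finditer(text):
        factor = TO_NM[match.group("unit").lower()]
        low = float(match.group("low")) * factor
        high = float(match.group("high")) * factor
        ranges.append((low, high))
    return ranges

print(extract_ranges_nm("a thickness of 10 to 50 nanometers"))     # [(10.0, 50.0)]
print(extract_ranges_nm("a barrier of 0.5–2 um on the substrate")) # [(500.0, 2000.0)]
```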
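And below is a toy sketch of the identity-resolution idea: normalize names, then group spelling variants whose string similarity exceeds a threshold. The company names and the 0.85 cutoff are purely illustrative, and the real algorithm is more involved.

```python
# Toy identity resolution: normalize names, then greedily cluster variants
# whose normalized forms are sufficiently similar.
from difflib import SequenceMatcher

def normalize(name):
    """Lowercase, strip punctuation, and drop common corporate suffixes."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    stopwords = {"inc", "corp", "corporation", "co", "ltd", "llc"}
    return " ".join(tok for tok in cleaned.split() if tok not in stopwords)

def resolve_identities(names, threshold=0.85):
    """Assign each name to the first cluster it is similar enough to."""
    clusters = []  # list of (canonical normalized name, [variants])
    for name in names:
        key = normalize(name)
        for canonical, variants in clusters:
            if SequenceMatcher(None, key, canonical).ratio() >= threshold:
                variants.append(name)
                break
        else:
            clusters.append((key, [name]))
    return clusters

assignees = [
    "Example Semiconductor Corp.",
    "Example Semiconductor Corporation",
    "EXAMPLE SEMICONDUCTOR CO., LTD.",
    "Other Materials Inc.",
]
for canonical, variants in resolve_identities(assignees):
    print(canonical, "->", variants)
```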
sales monitoring and forecasting with Bayesian Nets
We helped a global technology company save $5M in operations cost.
A sales process is a system that progresses through a sequence of stages, from initial prospecting to the goal of closing the deal. Each stage is described by a set of attributes, such as duration, deal value, and the probability of successfully closing the deal. The attributes of each stage largely depend on the attributes and events of the preceding stages. In this study, we applied Bayesian Networks to predict the probability of success and to estimate the time to closing and the final value of a sales deal.
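For readers unfamiliar with the technique, the sketch below shows a heavily condensed Bayesian Network of this kind using the open-source pgmpy library (an assumption on our part; class names can vary between pgmpy versions). The variables, their states, and the probability tables are illustrative, whereas the production model was far richer and learned from the client's pipeline history.

```python
# Condensed illustration of a Bayesian Network for deal outcomes using pgmpy.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Two illustrative stage attributes driving the outcome of a deal.
model = BayesianNetwork([("deal_size", "close"), ("stalled", "close")])

# P(deal_size): 0 = small, 1 = large.
cpd_size = TabularCPD("deal_size", 2, [[0.7], [0.3]])
# P(stalled): 0 = deal kept moving, 1 = deal stalled at some stage.
cpd_stalled = TabularCPD("stalled", 2, [[0.6], [0.4]])
# P(close | deal_size, stalled): 0 = lost, 1 = won. Columns follow the evidence
# combinations (small/moving, small/stalled, large/moving, large/stalled).
cpd_close = TabularCPD(
    "close", 2,
    [[0.5, 0.8, 0.3, 0.6],   # lost
     [0.5, 0.2, 0.7, 0.4]],  # won
    evidence=["deal_size", "stalled"],
    evidence_card=[2, 2],
)

model.add_cpds(cpd_size, cpd_stalled, cpd_close)
model.check_model()

# Probability that a large deal which has not stalled eventually closes.
inference = VariableElimination(model)
print(inference.query(["close"], evidence={"deal_size": 1, "stalled": 0}))
```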
Our customer is a global technology company that recently started a new line of business offering its products packaged as enterprise services, and it has already built a large sales pipeline. Its traditional sales management process is largely intuitive and experience-based, supported by rudimentary reporting analytics. Selling a new type of product comes with a learning curve, so the experience required for intuitive decision making simply wasn’t there, and additional insights were needed.
We built the solution keeping in mind that our customer wanted minimum disturbance to their existing processes. The front end was deliberately simple: each deal was assigned a daily-updated success score, a composite KPI computed with the Bayesian Network model that describes the probability of reaching the closing stage within specified time limits (a minimal scoring sketch follows the list below). Deals whose success score fell below a threshold value were marked with a big red flag, so that:
- The executives knew exactly when and where their corrective feedback was needed rather than conducting frequent reviews of the entire pipeline;
- The sales team stayed focused on the right tasks with a high probability of success and did not waste time on the “hopeless” deals.
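A minimal sketch of the daily scoring-and-flagging loop is shown below; it assumes a `predict` callable wrapping a model such as the one sketched above, and the deal records, attribute names, and 0.35 threshold are illustrative.

```python
# Daily scoring-and-flagging loop: refresh each deal's success score and raise
# a red flag whenever it drops below the threshold.
from dataclasses import dataclass

@dataclass
class Deal:
    deal_id: str
    attributes: dict          # current stage attributes fed to the model
    success_score: float = 0.0
    red_flag: bool = False

def daily_update(deals, predict, threshold=0.35):
    """Refresh every deal's success score and flag those below the threshold."""
    for deal in deals:
        deal.success_score = predict(deal.attributes)
        deal.red_flag = deal.success_score < threshold
    # Surface flagged deals so executives review only where feedback is needed.
    return [d for d in deals if d.red_flag]

# Illustrative stand-in for the model: score large, non-stalled deals higher.
def toy_predict(attributes):
    return 0.7 if attributes.get("deal_size") == 1 and not attributes.get("stalled") else 0.2

deals = [
    Deal("D-001", {"deal_size": 1, "stalled": 0}),
    Deal("D-002", {"deal_size": 0, "stalled": 1}),
]
for flagged in daily_update(deals, toy_predict):
    print(f"RED FLAG: {flagged.deal_id} (score {flagged.success_score:.2f})")
```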
As a bonus feature, our probabilistic model offered some insights into sales force behavior patterns. One interesting observation that might warrant further action was the detection of a human error factor in how the sales database was populated, and its possible deviation from ground truth.