AI + Behavioral Science
Since 2019, Yobi’s team has focused on the most fundamental challenge in AI/ML development: building predictive feature sets from high-quality data. Recent studies have highlighted how scarce such data is for supporting sophisticated AI and ML development.
Yobi makes the most of its consumer behavior dataset by combining innovative machine learning methods with behavioral science to build a clear picture of consumer behavior. This makes the kind of cutting-edge research used by large technology companies available to all AI/ML teams.
Machine learning models represent complex data by mapping it into a high-dimensional space, a process known as creating “embeddings”. For example, large language models like GPT-4 create embeddings for sentences, representing each sentence as a point in that space. Likewise, image generation models such as DALL·E create embeddings for images. Yobi uses similar technology to create embeddings for consumers, capturing each consumer’s behavior via their location in a high-dimensional space. Consumers whose behavior is more similar sit closer together, and consumers whose behavior is more different sit farther apart.
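To make the geometric picture concrete, here is a minimal sketch of how closeness between consumer embeddings could be measured. The vectors, their dimensionality, and the `cosine_similarity` helper are all illustrative assumptions for this sketch, not Yobi’s actual embeddings or API.

```python
import numpy as np

# Hypothetical 8-dimensional embeddings for three consumers (real
# embeddings would have a few hundred dimensions).
rng = np.random.default_rng(0)
base = rng.normal(size=8)
consumer_a = base + rng.normal(scale=0.1, size=8)  # behaves much like "base"
consumer_b = base + rng.normal(scale=0.1, size=8)  # also behaves like "base"
consumer_c = rng.normal(size=8)                    # unrelated behavior

def cosine_similarity(u, v):
    """Directional similarity of two embedding vectors (1.0 = identical)."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim_ab = cosine_similarity(consumer_a, consumer_b)
sim_ac = cosine_similarity(consumer_a, consumer_c)
# Consumers with similar behavior score higher than dissimilar ones.
```

Cosine similarity is just one common choice of distance in embedding spaces; Euclidean distance would tell the same story here.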
The value of embeddings is that they focus on the signal in the data, and discard the noise. The entire history of an individual consumer can be reduced to a few hundred numbers that convey all the information needed to make good predictions about that consumer’s future behavior. The embedding pulls out just the information that is relevant to prediction, making it more efficient to store and use than raw behavioral data, with none of the associated privacy risks.
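As an illustration of this compactness, the toy sketch below summarizes a long synthetic event history as a single fixed-length vector. The mean-pooling step is a deliberately simple stand-in; the model that actually produces Yobi embeddings is not described here.

```python
import numpy as np

# Synthetic stand-in for a consumer's raw behavioral history:
# 5,000 events, each described by 32 numeric features.
rng = np.random.default_rng(2)
event_history = rng.normal(size=(5000, 32))

# Toy "embedding": collapse the whole history into one fixed-length
# vector via mean pooling (illustrative only).
embedding = event_history.mean(axis=0)

# The summary is dramatically smaller than the raw history.
compression = event_history.size / embedding.size
```

Whatever the real model looks like, the payoff is the same: a fixed-length vector that is far cheaper to store and use than the raw event log.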
Large technology companies use these embeddings to power their own knowledge graphs. However, those knowledge graphs can only be used within each company’s own ecosystem. By developing the independent Yobi Knowledge Graph, Yobi aims to democratize this capability for everyone. The Knowledge Graph powers all of our products, allowing Yobi customers to build extremely accurate models for targeting and personalization.
A Foundation Model
Modern AI has gravitated toward massive models whose embeddings can be widely reused for other tasks. For example, the embeddings from GPT-4 can be used to build machine learning models that solve specific problems in natural language processing. Training GPT-4 was expensive, so rather than retraining the model for every new problem, practitioners reuse the embeddings from the pretrained model. Models like this are called foundation models, because they provide a foundation on which other machine learning models can be built.
Yobi has created a foundation model for consumer behavior. By training on our massive dataset, we have produced embeddings that AI/ML teams can use to train their own high-performance machine learning models to solve the specific problems their businesses face.
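A sketch of what building on such embeddings might look like: a small logistic-regression “head” trained on fixed, precomputed embedding vectors, with the foundation model itself left untouched. The synthetic embedding table, the labels, and the conversion-prediction task are all illustrative assumptions, not Yobi data.

```python
import numpy as np

# Illustrative stand-ins: in a real workflow the rows below would be
# precomputed embeddings fetched from the foundation model, not random.
rng = np.random.default_rng(1)
n, d = 200, 16
embeddings = rng.normal(size=(n, d))              # one row per consumer
true_w = rng.normal(size=d)
labels = (embeddings @ true_w > 0).astype(float)  # e.g. "will convert?"

# Train a simple logistic-regression head on the fixed embeddings;
# the foundation model that produced them is never retrained.
w = np.zeros(d)
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(embeddings @ w)))   # predicted probabilities
    w -= 0.1 * embeddings.T @ (p - labels) / n    # gradient descent step

accuracy = float(((embeddings @ w > 0) == (labels == 1)).mean())
```

The head here is deliberately tiny; the same pattern applies with any downstream model (gradient-boosted trees, neural networks) consuming the embedding columns as features.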