AI trained on your Content

With the Gitana platform, the goal is to deliver content that powers intelligent endpoints and customer touchpoints. This includes training AI models that support your live applications and points of customer interaction.

We offer design and development professionals who will augment your team as you introduce trained AI models into your business. We will work with your team from the discovery phase through to the development and scale-out of your offering.

We will also guide you through using the Gitana Platform for content structures, graph relationships, automation and the iterative training of your AI models.

AI Model Use Cases

  • Domain Expertise

    AI models are trained on your content, learning its structure and nature to gain an understanding of your business, brand and products.

  • Sales Optimization

    AI models are calibrated to your sales objectives, active promotions and product placement goals.

  • Customer Needs

    AI models provide real-time inference to support recommendations and personalized decisions that support your customers as they make buying decisions.

AI Providers

Connect your content publishing and workflow processes to Machine Learning / AutoML providers for automated training, validation and fine-tuning of your content set and AI models ahead of release.

Gitana features pre-built integrations to popular providers or you may connect to your own self-managed pipelines.

Google Vertex AI

Azure ML

AWS ML

OpenAI

Watson ML

Train your AI Models

Generate data sets and stream them into a pipeline to seamlessly train your AI models as experts in your business, your content and your brand.

Trained AI models optimize the customer experience by providing personalized and goal-optimized inference services.

Here are a few examples of AI models that support inference at the edge.

Predictive Recommendations

Predict what your customers will want to see next. Show what's most relevant based on where they are in their journey and their contextual navigation, from pre-purchase to post-purchase.

Personalization

Using real-time customer data and training guidance, create dynamic and custom navigation experiences, delivering pages and components that are most relevant for each unique customer.

Chat Bots / Assistants

Provide better answers for your customers as they interact with chat bots, assistants and other natural language interfaces. Improve accuracy as these models parse, interpret and respond to customer questions.

Personalized Search

Find it faster. With intent-based detection, users search not only on words but also on the intent expressed in their language and behavior, yielding instantly relevant results.

Intelligent Tagging

Automatically apply tags to content based on the text, context and intent of the document. Tags enable automatic grouping and relatedness between documents.
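As an illustrative sketch only (not the Gitana tagging engine), a minimal rule-based tagger might map known keywords to tags; a trained model generalizes beyond fixed rules like these:

```python
# Hypothetical keyword-to-tag rules for illustration; a real system
# would use a trained classifier rather than a fixed lookup table.
TAG_RULES = {
    "invoice": ["finance", "billing"],
    "warranty": ["support", "legal"],
    "battery": ["hardware"],
}

def suggest_tags(text):
    """Suggest tags by scanning the document text for known keywords."""
    words = set(text.lower().split())
    tags = set()
    for keyword, keyword_tags in TAG_RULES.items():
        if keyword in words:
            tags.update(keyword_tags)
    return sorted(tags)
```

In practice the trained model would also weigh context and document intent, not just keyword presence.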

Taxonomy / Classification

Classify your content according to taxonomy structures and rules. Group, tag and apply hierarchical relationships, and use aspects to inject metadata and behaviors into your content.

Content Generation (Preview)

Spin up private and secure Large Language Models (LLMs) to provide content generation, summarization, extraction and synopsis using Generative AI. Train and fine-tune the LLM on your content, your brand and your messages.

Questions and Answers

When a user types in a question, utilize auto-completion and contextual awareness to instantly serve up one or more responses they need. Provide better customer support with the right answers in fewer clicks.

Automatic Generation of Datasets

Incrementally build and augment your training and validation data sets whenever content changes in your branch. Scripts run on the affected content and produce incremental data sets that are submitted to your machine learning pipelines.

  • Data Set Generation

    Background jobs execute the generation of both training and validation datasets using server-side scripting. Scripts query your content and fetch relational structures from your content graph. Data is normalized and mapped into training completion rows in JSON, JSONL or other required formats.

  • Incremental

    As new data appears in the content graph, it is incrementally fetched and compiled and then aggregated into the total dataset. This makes dataset augmentation fast and allows for incremental training and fine-tuning of your AI models as new data is made available.

  • Training and Validation

    Training data sets are maintained distinct from validation data sets, allowing for both tuning of your AI models and validation of those models with scoring and feedback needed to calibrate the content model and data set generation heuristics.
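The steps above can be sketched in a few lines of Python. This is a minimal illustration, not the Gitana scripting API: the content nodes, the prompt/completion mapping and the split ratio are all assumptions made for the example.

```python
import json
import random

def node_to_row(node):
    """Map a content node to a prompt/completion training row (hypothetical mapping)."""
    return {
        "prompt": f"Describe the product '{node['title']}'.",
        "completion": node["description"],
    }

def build_datasets(nodes, validation_ratio=0.2, seed=42):
    """Shuffle nodes and split them into distinct training and validation sets."""
    rng = random.Random(seed)
    shuffled = list(nodes)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - validation_ratio))
    train = [node_to_row(n) for n in shuffled[:cut]]
    valid = [node_to_row(n) for n in shuffled[cut:]]
    return train, valid

def to_jsonl(rows):
    """Serialize rows as JSONL: one JSON object per line."""
    return "\n".join(json.dumps(r) for r in rows)
```

Keeping the training and validation sets separate, as above, is what makes the later scoring step meaningful: the model is never validated on rows it was trained on.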

Machine Learning Pipelines

Train and validate your AI models using your approved content with continuous training. Datasets are generated and submitted to Machine Learning pipeline providers for integrated testing and validation of your models. Scored results are returned and acted upon to improve your content set, training scripts or graph heuristics.

  • Training ahead of Release

    Publishing workflows allow you to schedule content for a future release. Ahead of this release, pre-train a new revision of your model using approved content. Then, on launch, swap the new AI model in for the old one to leverage the new set of inferences.

  • Distributed Training Execution

    Training and validation runs as a distributed background job. Keep track of progress and receive notification when training completes. Retrieve results, act on them or advance the publishing workflow for iteration.

  • Branch-based, A/B Testing

Training and validation work hand-in-hand with branches. Create a new branch of content changes and train a branch-specific revision of your AI model with those changes. Then dynamically switch models within your app to perform in-application A/B testing and determine which model yields the best fit.
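One common way to implement the in-application model switch is deterministic user bucketing. The sketch below is illustrative, with hypothetical model names; it is not part of the Gitana API:

```python
import hashlib

# Hypothetical model revisions trained from two content branches.
MODEL_VARIANTS = {
    "A": "model-main-v12",     # trained on the main branch
    "B": "model-feature-v12",  # trained on the feature branch
}

def assign_variant(user_id, split=0.5):
    """Deterministically bucket a user into variant A or B by hashing their id."""
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    bucket = digest[0] / 255.0  # first byte normalized to [0, 1]
    return "A" if bucket < split else "B"

def model_for_user(user_id):
    """Resolve which model revision should serve this user's inference requests."""
    return MODEL_VARIANTS[assign_variant(user_id)]
```

Hashing the user id (rather than assigning randomly per request) keeps each user pinned to one variant for the duration of the test, so the comparison between models stays clean.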

Fine Tune for Continuous Learning

Validation results provide insights into how well your trained AI model's predictions correlated with what you expected. Improve your model's accuracy by fine-tuning your content, graph relationships, weights and training scripts.

  • Scoring

    Having trained your model with the training dataset, the validation dataset yields test results to indicate how well your model is trained. See what tests failed and the overall percentage score for your validation run.

  • Fine Tune

    Adjust your content properties or weighted graph relationships to tighten the relationships between instances. Optimize your content so that preferences are given to the kinds of outcomes that you expect upon AI model inference.

  • Iterate

    With your optimizations in place, re-run the training with a new version of your AI training model. Compare scores to see what improved and whether further investment is needed.
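The score-and-compare loop described above can be expressed simply. This is a generic sketch of the idea (pass-percentage scoring over hypothetical validation results), not the scoring format Gitana returns:

```python
def validation_score(results):
    """Compute the pass percentage across validation test results."""
    passed = sum(1 for r in results if r["passed"])
    return 100.0 * passed / len(results)

def compare_runs(previous, current):
    """Report whether a retrained model improved over the prior validation run."""
    delta = validation_score(current) - validation_score(previous)
    return {"delta": delta, "improved": delta > 0}
```

Tracking the delta between runs, rather than the absolute score alone, shows whether each round of content and weight adjustments is actually paying off.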

Consulting Services

Our engineers will instantiate, configure and host your live AI models to support real-time training and delivery of smart inference services for your applications and APIs.

Machine Learning

Analyze customer interactions and touchpoints with your data to more effectively support customer decisions and needs at the time of purchase.

Natural Language Processing

Parse and understand text, and perform speech recognition, sentiment analysis, text summarization and translation for new languages and new audiences.

Neural Models

Solutions built on top of neural networks and trained on labels and classifiers using self-supervised learning and feedback loops.

Large Language Models

Create a wide variety of data, such as images, video and text, by learning patterns from existing data and using that knowledge to generate new and unique outputs.

Discovery and Design Workshop

At the outset of an engagement, we will sit down with your team for a Discovery and Design Workshop. The goal is to understand your data and your objectives, and to explore ways AI models could support those objectives.

  1. Gather

    We gather information on your data and your AI objectives.

  2. Ideate

    Collaborative sessions focused on identifying use cases specific to your needs and objectives.

  3. Prioritize

Refine the list and determine the most important use cases.

  4. Deliver

    We produce a report describing an implementation plan for delivering on your desired use cases.

We will work with your team, no matter where they are on their AI journey, to bring these use cases to life.

Proof of Concept / Pilot

With the use cases in hand, we focus on the implementation of a minimum viable product (MVP) or proof of concept (POC) that demonstrates the value of the AI model or models.

This phase focuses on building the AI models and engineering pipelines to support the implementation. The expectation is that this will be highly iterative. We use rapid prototypes to quickly turn around working examples that demonstrate the training and usage of the trained model.

We will work with your team through acceptance testing. We then pilot these models to test and validate resiliency in a real-world setting.

At the end of this phase, we will have a working model and prototype. We will also provide guidance on lessons learned and a report on how to proceed toward roll-out.

Development and Scale

Our engineers are available to augment your team for any training or development work required to move the implementation to production.

We will build on the lessons learned from the pilot to advance the quality and readiness of the solution for go-live. These objectives focus on scaling up the solution for real-time usage or larger request loads.

In addition, our engineers are available to provide developer assistance and architectural guidance as needed.

Ready to Get Started?

Unlock your data with smart content services and real-time deployment


Try Gitana for free
Ask for a demo