Building Intelligent Fashion Catalogs: A Guide to Implementing Vector Search


Keyword search is broken for fashion. If a user searches for "long flowy dress for a summer wedding", a traditional SQL LIKE query or even Elasticsearch often fails. Why? Because the product description might say "Maxi gown, floral print, lightweight fabric". The keywords don't match, even though the intent is a perfect fit.
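To make the failure concrete, here's roughly what the keyword approach boils down to (a sketch; the table and column names match the schema we'll create below):

```sql
-- Returns zero rows: the description says "Maxi gown, floral print,
-- lightweight fabric", so none of the query's keywords overlap.
select * from products
where description ilike '%summer%'
   or description ilike '%wedding%'
   or description ilike '%flowy%';
```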

This is where Vector Search (Semantic Search) changes the game. By converting your product catalog into mathematical vectors, you can search by meaning, not just syntax.

In this guide, we'll build a simple semantic search engine for a fashion catalog.

The Stack

  • Database: Supabase (PostgreSQL with the pgvector extension).
  • Embeddings: OpenAI text-embedding-3-small (fast and cheap).
  • Framework: Next.js.

Step 1: Enable Vector Support

First, enable the vector extension in your Supabase SQL editor:

create extension if not exists vector;

Then, create a table to store your products and their embeddings. Note the embedding column uses vector(1536), the default output dimensionality of text-embedding-3-small.

create table products (
  id bigint primary key generated always as identity,
  title text not null,
  description text not null,
  image_url text,
  embedding vector(1536)
);
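With no index, every search scans the entire table. That's fine for small catalogs, but once you pass a few thousand products you may want an approximate nearest-neighbor index. One option is an HNSW index on cosine distance (a sketch; the syntax assumes pgvector 0.5.0 or later):

```sql
-- Optional: approximate nearest-neighbor index for larger catalogs.
-- vector_cosine_ops pairs with the <=> (cosine distance) operator used later.
create index on products using hnsw (embedding vector_cosine_ops);
```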

Step 2: Generating Embeddings

When you add a product to your catalog, you need to generate its vector representation. We'll combine the title and description into a single string to capture the full context.

import OpenAI from 'openai';
import { supabase } from '@/lib/supabaseClient';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

async function addProduct(title: string, description: string) {
  // 1. Create the text to embed
  const content = `Title: ${title}\nDescription: ${description}`;

  // 2. Generate embedding
  const embeddingResponse = await openai.embeddings.create({
    model: 'text-embedding-3-small',
    input: content,
  });

  const embedding = embeddingResponse.data[0].embedding;

  // 3. Store in Supabase
  const { error } = await supabase
    .from('products')
    .insert({ title, description, embedding });

  if (error) throw error;
}

Step 3: Searching by Meaning

Now for the magic. When a user searches, we don't look for keywords. We:

  1. Convert their search query into a vector.
  2. Find the mathematically closest product vectors in our database (Cosine Similarity).
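Cosine similarity measures the angle between two vectors: 1 means they point in the same direction, 0 means they're unrelated. pgvector computes this for us via its <=> cosine-distance operator, but the underlying math is simple; a hypothetical helper, purely for intuition:

```typescript
// Illustrative only — in production pgvector does this inside the database.
// Cosine similarity = dot(a, b) / (|a| * |b|).
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Note that <=> returns cosine *distance* (1 minus similarity), which is why the SQL below subtracts the result from 1.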

First, create a PostgreSQL function (which Supabase exposes as an RPC) to handle the math:

create or replace function match_products (
  query_embedding vector(1536),
  match_threshold float,
  match_count int
)
returns table (
  id bigint,
  title text,
  description text,
  similarity float
)
language plpgsql stable
as $$
begin
  return query
  select
    products.id,
    products.title,
    products.description,
    1 - (products.embedding <=> query_embedding) as similarity
  from products
  where 1 - (products.embedding <=> query_embedding) > match_threshold
  -- Order by ascending distance (= descending similarity). Referencing the
  -- "similarity" alias here would be ambiguous with the plpgsql output column.
  order by products.embedding <=> query_embedding
  limit match_count;
end;
$$;

Step 4: The Frontend Search

Finally, call this function from your Next.js API route or Server Action:

import OpenAI from 'openai';
import { supabase } from '@/lib/supabaseClient';

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

export async function searchProducts(userQuery: string) {
  // 1. Embed the user's query with the same model used for the catalog
  const embeddingResponse = await openai.embeddings.create({
    model: 'text-embedding-3-small',
    input: userQuery,
  });

  const queryEmbedding = embeddingResponse.data[0].embedding;

  // 2. Call the RPC function
  const { data: products, error } = await supabase.rpc('match_products', {
    query_embedding: queryEmbedding,
    match_threshold: 0.5, // Adjust based on strictness
    match_count: 5,
  });

  if (error) throw error;
  return products;
}
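The match_threshold trades recall for precision: lower it and you surface more, but looser, matches. Beyond a single cutoff, you can also bucket results client-side before rendering. A hypothetical helper (the MatchedProduct shape mirrors the RPC's return columns; the 0.75 cutoff is an assumption to tune against your own catalog):

```typescript
// Hypothetical type mirroring the columns match_products returns.
interface MatchedProduct {
  id: number;
  title: string;
  description: string;
  similarity: number;
}

// Partition matches so the UI can label "Best matches" vs
// "You might also like". The 0.75 default is an assumed cutoff.
function groupBySimilarity(products: MatchedProduct[], strongCutoff = 0.75) {
  const strong = products.filter((p) => p.similarity >= strongCutoff);
  const weak = products.filter((p) => p.similarity < strongCutoff);
  return { strong, weak };
}
```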

The Result

Now, a search for "outfit for a rock concert" will return your black leather jackets and distressed denim, even if the word "rock" never appears in their descriptions. The embeddings capture that the vibe of the clothing matches the context of the event.

Want to go deeper?

Vector search is just the beginning. We can help you implement Multi-Modal Search (search by uploading an image) or Hybrid Search (combining vectors with keyword filters for size/color).

Check out our technical case studies to see how we reduced "zero result" searches by 80% for a major fashion retailer.