How to recognize furniture items with vision AI (2024 guide for a Python FastAPI app)

June 18, 2024

In our last guide, we showed you how to use the Dragoneye API to build a React app that could recognize clothing items in detail from images. We covered the process of setting up the scaffolding for the React app, installing and integrating the Dragoneye Node.js package, and designing a simple UI where users could upload an image and see the results.

This week, we’re shifting gears to focus on how to use the Dragoneye API in a backend service that powers our frontend application. Specifically, we'll implement a mini backend service for a hypothetical furniture planning app. This app will use object detection and image recognition to help users find furniture and home decor for their new homes.

Our backend service will need to take an image from the frontend, call the Dragoneye API, and then return the types of furniture in the image (e.g., sofa, table, chair) along with the specific features of each item.

Why Proxy API Calls Through a Backend Service?

Before jumping into the implementation, it's useful to understand why calling external APIs from a backend service, rather than directly from the frontend, is beneficial.

1. Keeping API Keys Secure

Many external APIs require you to authorize your calls with an API key (token). This key is used to uniquely identify and authenticate you as the caller so that you can access the API. 

If these calls are made from the frontend, the key must be included in the client code, making it susceptible to being found and used without permission.

While it’s tempting to think that a well-obfuscated API key is protection enough, attackers have an ever-growing suite of tools to find keys wherever they are hidden. And once a key is stolen, they can use it to run up a large tab before you even realize it.

By proxying the API requests through your own server, you can keep your API keys secure since they never leave the servers that you control!
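In practice, "keeping the key on the server" usually just means reading it from the server's environment rather than embedding it in any shipped code. A minimal sketch (the DRAGONEYE_API_KEY variable name is our own choice for the example, not prescribed by the API):

```python
import os


def get_dragoneye_api_key() -> str:
    # The key lives only in the server's environment, so it never
    # appears in client bundles or version control.
    key = os.environ.get("DRAGONEYE_API_KEY")
    if key is None:
        raise RuntimeError("DRAGONEYE_API_KEY is not set")
    return key
```

The server process gets the variable from its deployment environment (e.g., a secrets manager or the host's env), and the frontend never sees it.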

2. Offloading Processing to the Backend

API responses often require processing, such as computing new values or fetching additional resources. Doing intensive processing on the client can burden the user's device and degrade the experience. Offloading this work to the backend, which can be scaled as needed, improves overall performance.

3. Simplifying API Updates

APIs evolve, introducing new features and changes. Adapting frontend code, especially in apps distributed through app stores, can be challenging as users might not update to the latest version quickly. 

From my experience at Meta, you’d be surprised how slowly folks adopt new versions of apps and how many are running versions that are more than a year old!

By handling API calls in a backend service, updates can be made centrally, without requiring changes to the client code.
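As a tiny illustration of that idea, the backend can absorb an upstream schema change so already-shipped clients keep working. This is a hypothetical sketch; the field rename is invented purely for the example:

```python
def to_client_schema(new_api_item: dict) -> dict:
    # Suppose a new version of the external API renamed "displayName" to
    # "display_name". The backend translates it back, so old clients that
    # still expect "displayName" keep working without an app update.
    return {"displayName": new_api_item["display_name"]}


print(to_client_schema({"display_name": "Coffee Table"}))  # {'displayName': 'Coffee Table'}
```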

Getting started with FastAPI

Now that we have a better understanding of the advantages of proxying external API calls, let’s jump into creating a new backend service to proxy our Dragoneye API calls. For this, we’re going to spin up a backend service in Python with the FastAPI package.

FastAPI is an excellent choice for developing backend web services. It's easy to use, modern, and feature-rich, making it suitable for projects ranging from quick hobby projects to full-fledged production services.

Setting up the environment

To get started with FastAPI, let’s first set up a virtual Python environment. While there are many options for this, my favorite is mamba. Check out their installation instructions here.

Next, create a new environment:

> mamba create -n dragoneye_fastapi_server python=3.11

You’re welcome to use any supported version of Python, which at the time of writing means Python 3.8 or newer.

Activate the environment:

> mamba activate dragoneye_fastapi_server

Install FastAPI:

> pip install fastapi

(Depending on your FastAPI version, you may need pip install "fastapi[standard]" to get the fastapi CLI we use below.)

Create a new folder for the project:

> mkdir dragoneye_fastapi_server
> cd dragoneye_fastapi_server

Create a file called main.py and add the following skeleton code:

from fastapi import FastAPI

app = FastAPI()


@app.get("/")
async def root():
    return {"message": "Hello World"}

And that’s all we need to get the minimal version of the server running. Let’s start it up:

> fastapi dev main.py

INFO     Using path main.py
INFO     Searching for package file structure from directories with __init__.py files

 ╭─ Python module file ─╮
 │                      │
 │  🐍 main.py          │
 │                      │
 ╰──────────────────────╯

INFO     Importing module main
INFO     Found importable FastAPI app

 ╭─ Importable FastAPI app ─╮
 │                          │
 │  from main import app    │
 │                          │
 ╰──────────────────────────╯

INFO     Using import string main:app

 ╭────────── FastAPI CLI - Development mode ───────────╮
 │                                                     │
 │  Serving at: http://127.0.0.1:8000                  │
 │                                                     │
 │  API docs: http://127.0.0.1:8000/docs               │
 │                                                     │
 │  Running in development mode, for production use:   │
 │                                                     │
 │  fastapi run                                        │
 │                                                     │
 ╰─────────────────────────────────────────────────────╯

INFO:     Uvicorn running on http://127.0.0.1:8000 (Press CTRL+C to quit)
INFO:     Waiting for application startup.
INFO:     Application startup complete.

Awesome! 

Interactive docs

One of the great features of FastAPI is that it automatically generates and serves interactive docs for your API without any additional work. 

Head to http://127.0.0.1:8000/docs, and you’ll see the generated docs. One cool thing about these docs is that they allow you to make requests to your API from within the docs themselves, making it super easy to test things out. 

Here we give the root path / a GET request and receive “Hello World” as the response.

We’ll use these docs again in a moment to test out the new endpoints we’re about to write.

Integrating the Dragoneye API

Let’s now install and write our first call to the Dragoneye API with the dragoneye-python package, which will allow us to use the Dragoneye vision understanding APIs easily. 

With the mamba environment activated, run:

> pip install dragoneye-python

With the package installed, we can go ahead and add the following code to main.py to create a simple proxy POST request. This request will take an image file, call the Dragoneye API, and return the response from the API directly.

from dragoneye import ClassificationPredictImageResponse, Dragoneye, Image
from fastapi import FastAPI, UploadFile

API_KEY = "YOUR_API_KEY_HERE"  # In production, load this from an environment variable rather than hardcoding it

app = FastAPI()


@app.get("/")
async def root():
    return {"message": "Hello World"}


@app.post("/predictions/furniture")
async def predictions_furniture(
    image_file: UploadFile,
) -> ClassificationPredictImageResponse:
    dragoneye_client = Dragoneye(api_key=API_KEY)

    prediction_result = dragoneye_client.classification.predict(
        image=Image(file_or_bytes=image_file.file.read()),
        model_name="dragoneye/furniture",
    )

    return prediction_result

Testing the API

Once you save, fastapi dev should automatically pick up the changes in the file and reload the server. Let’s head over to /docs to see what’s new!

As you can see, a new endpoint called /predictions/furniture has been added. 

Let’s now try it out with this image of a coffee table: 

We can submit the image under the image_file parameter in the interactive demo. 

Taking a closer look at the Response body:

{
  "predictions": [
    {
      "normalizedBbox": [
        0.21953125,
        0.32974601359439126,
        0.99921875,
        0.8612135024794803
      ],
      "category": {
        "id": 2924507580,
        "type": "category",
        "name": "coffee_table",
        "displayName": "Coffee Table",
        "score": 0.9835389852523804,
        "children": []
      },
      "traits": [
        {
          "id": 259598478,
          "name": "table/countertop_material",
          "displayName": "Table/Countertop Material",
          "taxons": [
            {
              "id": 4109072646,
              "type": "trait",
              "name": "wood",
              "displayName": "Wood",
              "score": 0.8274025917053223,
              "children": []
            }
          ]
        },
        {
          "id": 2773422134,
          "name": "base/leg_material",
          "displayName": "Base/Leg Material",
          "taxons": [
            {
              "id": 2773422134,
              "type": "trait",
              "name": "base/leg_material",
              "displayName": "Metal",
              "score": null,
              "children": [
                {
                  "id": 2108228504,
                  "type": "trait",
                  "name": "stainless_steel",
                  "displayName": "Stainless Steel",
                  "score": 0.7056350111961365,
                  "children": []
                }
              ]
            }
          ]
        },
        {
          "id": 1015792522,
          "name": "furniture_style",
          "displayName": "Furniture Style",
          "taxons": [
            {
              "id": 1250592018,
              "type": "trait",
              "name": "industrial",
              "displayName": "Industrial",
              "score": 0.7562316060066223,
              "children": []
            }
          ]
        },
        {
          "id": 4022377319,
          "name": "table/countertop_shape",
          "displayName": "Table/Countertop Shape",
          "taxons": [
            {
              "id": 2881556910,
              "type": "trait",
              "name": "rectangular",
              "displayName": "Rectangular",
              "score": 0.8770500421524048,
              "children": []
            }
          ]
        },
        {
          "id": 1412086231,
          "name": "base/leg_type",
          "displayName": "Base/Leg Type",
          "taxons": [
            {
              "id": 4218308483,
              "type": "trait",
              "name": "sled",
              "displayName": "Sled",
              "score": 0.9359708428382874,
              "children": []
            }
          ]
        }
      ]
    }
  ]
}

Nice! Here’s a visual representation of the data so it’s easier to see. 

A few things to point out that the Dragoneye API has done: 

  • Object detection: identifying the exact location of the coffee table in the image
  • Tag extraction: using computer vision to understand what furniture style, material, etc. the coffee table is

These results will definitely be useful in building out our new home decor app!
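To make the response structure concrete, here’s how you might walk that JSON in plain Python, using an abridged copy of the response above:

```python
# Abridged version of the Dragoneye response shown above.
response_body = {
    "predictions": [
        {
            "normalizedBbox": [0.2195, 0.3297, 0.9992, 0.8612],
            "category": {"displayName": "Coffee Table", "score": 0.9835},
            "traits": [
                {
                    "displayName": "Furniture Style",
                    "taxons": [{"displayName": "Industrial", "score": 0.7562}],
                }
            ],
        }
    ]
}

# Flatten each detected object into its category plus a map of trait names
# to predicted taxon names.
for prediction in response_body["predictions"]:
    category = prediction["category"]["displayName"]
    traits = {
        trait["displayName"]: [t["displayName"] for t in trait["taxons"]]
        for trait in prediction["traits"]
    }
    print(category, traits)  # Coffee Table {'Furniture Style': ['Industrial']}
```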

Post-Processing Results

Before we wrap up, I wanted to briefly touch on how we can use the backend to post-process some of the results before sending them back to the client.

Let’s say that we want to build a feature that visually identifies the styles of the furniture in the image, and then searches for other items of furniture with the same style. We can build an endpoint that takes in an image of furniture, and then returns the furniture styles that are detected.

In main.py, we can add a method predictions_furniture_style:

from typing import Sequence, Set
from dragoneye import (
    ClassificationObjectPrediction,
    Dragoneye,
    Image,
    TaxonPrediction,
)
from fastapi import UploadFile

...

@app.post("/predictions/furniture_style")
async def predictions_furniture_style(
    image_file: UploadFile,
) -> Sequence[str]:
    def get_furniture_style_predictions(
        prediction: ClassificationObjectPrediction,
    ) -> Set[str]:
        def get_child_predictions_recursive(prediction: TaxonPrediction) -> Set[str]:
            return {
                prediction.displayName,
                *(
                    name
                    for child_prediction in prediction.children
                    for name in get_child_predictions_recursive(child_prediction)
                ),
            }

        furniture_style_prediction = next(
            (
                trait_prediction
                for trait_prediction in prediction.traits
                if trait_prediction.id == 1015792522  # the taxon_id for furniture style
            ),
            None,
        )

        if furniture_style_prediction is None:
            return set()

        return {
            name
            for taxon_prediction in furniture_style_prediction.taxons
            for name in get_child_predictions_recursive(taxon_prediction)
        }

    dragoneye_client = Dragoneye(api_key=API_KEY)

    prediction_result = dragoneye_client.classification.predict(
        image=Image(file_or_bytes=image_file.file.read()),
        model_name="dragoneye/furniture",
    )

    return [
        furniture_style
        for prediction in prediction_result.predictions
        for furniture_style in get_furniture_style_predictions(prediction)
    ]

The function gets the results from the Dragoneye API, finds the trait predictions for furniture styles, and returns all of the predictions. 
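To see what the recursion buys us, here is the same flattening idea as a stand-alone sketch over plain dicts (the dicts stand in for the TaxonPrediction objects, which expose displayName and children):

```python
from typing import Set


def collect_display_names(taxon: dict) -> Set[str]:
    # Gather this taxon's displayName plus those of all its descendants.
    names = {taxon["displayName"]}
    for child in taxon.get("children", []):
        names |= collect_display_names(child)
    return names


# Mirrors the nested "base/leg_material" trait from the sample response above,
# where "Metal" has a more specific child, "Stainless Steel".
metal_taxon = {
    "displayName": "Metal",
    "children": [{"displayName": "Stainless Steel", "children": []}],
}
print(sorted(collect_display_names(metal_taxon)))  # ['Metal', 'Stainless Steel']
```

Without the recursion, we would only capture the top-level taxon and miss the more specific child predictions.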

One thing of note is that we filtered the trait predictions down by taxon_id (1015792522, the furniture style trait type). How do you find the appropriate ID for your use case? For that, we’ve built the Taxonomy Explorer tool in the Dragoneye dashboard. It includes all of the taxon tags that Dragoneye can predict, along with additional information on each taxon such as its ID, related taxons, etc.

Looking at the interactive docs, we see that “Industrial” is now returned correctly for our image, which we can then plug straight into the search function in our app!

Summary

This article shows how easy it is to integrate the Dragoneye API into a backend service such as the one that we built with FastAPI in Python. Of course, you are welcome to use any framework of choice, since the Dragoneye API fundamentally is just a REST API call under the hood.

Hopefully, this has also been helpful in motivating why it can be quite useful to stand up a backend service between your frontend and external APIs, even if it’s simply for proxying!

If you want to get started, you can find all of the code for this article in our examples repo, and you can sign up for the Dragoneye API here. We haven't implemented API credits in the Free tier yet, but send us an email at support@dragoneye.ai and we're super happy to add some for you manually to get you started (mention the ⚾️ emoji in your email).

Let us know if this was helpful and what ideas you have for how to use this. Happy building!
