Artera
Year
2024
Industry
Art & Culture Tech
Scope of work
Backend / Django / AWS
Timeline
Sept 2023 – Dec 2024
Engineered the Infrastructure That Powers Artera’s AI Art Matching and Real-Time Search
When I joined Artera, the backend was a single oversized Django app running on a basic Linux machine: no containerization, no separation of concerns. I led the transformation by modularizing the codebase into clean, domain-specific apps and dockerizing the system for deployment on AWS ECS behind a load balancer. This enabled high availability and horizontal scaling, both essential for supporting an AI-powered user experience.

One of the key product features I helped bring to life was real-time, AI-driven artwork discovery. Users could upload selfies or photos, and Artera's AI engine would find visually similar art from a database of nearly a million pieces. I engineered the REST APIs and data bridges between the client and OpenSearch, the system that handled high-performance vector search and ranking. These APIs provided paginated, permission-aware access to OpenSearch, with serialization layers that normalized AI responses into a consistent schema for the frontend, which was critical for keeping Redux state predictable and maintaining user trust.

By abstracting the AI and OpenSearch layer behind secure, documented endpoints, I let the frontend tap into AI capabilities with zero friction while preserving performance, safety, and schema integrity. The result was a reliable backend that served AI-powered recommendations, trending artworks, and user-curated collections at scale.
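To make the API-bridge idea concrete, here is a minimal sketch of what such a layer can look like: one helper builds a paginated OpenSearch k-NN query, and another normalizes the raw hits into a stable payload for the frontend. The index field names (`image_vector`, `artwork_id`, and so on) and both function names are illustrative assumptions, not Artera's actual schema.

```python
# Sketch of an API-layer bridge between a Django REST endpoint and
# OpenSearch vector search. Field names and helpers are illustrative.

def knn_query(embedding, page=1, page_size=20):
    """Build an OpenSearch k-NN query body with offset pagination."""
    return {
        "size": page_size,
        "from": (page - 1) * page_size,
        "query": {
            "knn": {
                "image_vector": {        # assumed vector field name
                    "vector": embedding,
                    "k": page_size,
                }
            }
        },
        "_source": ["artwork_id", "title", "artist", "image_url"],
    }


def normalize_hits(raw_response, page, page_size):
    """Flatten raw OpenSearch hits into a consistent, paginated payload."""
    hits = raw_response.get("hits", {})
    total = hits.get("total", {}).get("value", 0)
    results = [
        {**hit["_source"], "score": hit["_score"]}
        for hit in hits.get("hits", [])
    ]
    return {
        "count": total,
        "page": page,
        "num_pages": max(1, -(-total // page_size)),  # ceiling division
        "results": results,
    }
```

Keeping the normalization in one place means every endpoint that fronts the AI layer returns the same shape, so the client never has to branch on raw search-engine internals.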



Built Robust Environment Separation for Safer Releases and Scalable Development
To support a growing engineering team and continuous iteration on AI-driven features, I implemented a clean multi-environment setup for staging and production. Before my changes, everything ran off a single live environment, which made testing risky and often unreliable. I introduced a Django split-settings configuration, containerized each environment, and ensured environment variables were securely managed per deployment through ECS task definitions.

This separation let developers safely test features like AI-generated recommendations and frontend UI changes without risking live data. It also enabled automated deployments, version tracking, and consistent behavior across environments: a foundational step in turning Artera into a scalable, production-grade platform.
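A minimal sketch of the split-settings pattern described above: a shared base module exposes a small helper for reading ECS-injected environment variables, and each environment-specific module overrides only what differs. File layout, variable names, and the `env` helper are all assumptions for illustration.

```python
import os

# settings/base.py (sketch): shared configuration plus a helper that
# environment-specific modules use to read their ECS-injected variables.
# Staging and production run the same container image; the ECS task
# definition for each environment injects a different set of values.

def env(name, default=None, required=False):
    """Read a config value from the process environment."""
    value = os.environ.get(name, default)
    if required and value is None:
        raise RuntimeError(f"Missing required environment variable: {name}")
    return value

# settings/production.py (sketch):
#   from .base import *               # noqa: F401,F403
#   DEBUG = False
#   ALLOWED_HOSTS = env("ALLOWED_HOSTS", required=True).split(",")
#
# The container entrypoint then selects the module per deployment:
#   DJANGO_SETTINGS_MODULE=config.settings.production gunicorn config.wsgi
```

Failing fast on a missing required variable surfaces misconfigured task definitions at container startup instead of as silent runtime bugs.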



Want to be our next case study? Let’s build something worth showcasing.
© 2025 ForgeOne
Let’s talk.
Tell us about your project, whether it's a scalable backend system or a custom SaaS platform.
Quick response.
We respond quickly and give every inquiry the attention it deserves.
Clear next steps.
After your consultation, we’ll outline a clear plan with timelines and next steps.