
Selected Work
Client projects and independent work: I'm shaping my portfolio with meaningful projects and strategic decisions that complement my specialized services.

Mother Tongue – AI Poetics Meets Language Preservation
Tuesday, January 20, 2026
This project explored how artificial intelligence can be used as a tool for cultural preservation rather than abstraction alone. Drawing from endangered languages, oral traditions, and regional folklore, the work translated linguistic structure and symbolism into visual and auditory form, allowing audiences to engage with language beyond direct comprehension.
Using custom-trained language and image-generation models, the project combined AI-generated visuals with original audio fragments to create a visual-poetic series titled Mother Tongue. The work was presented in both academic and public settings, where it has since been used as an educational resource and a reference point for future initiatives focused on language, technology, and cultural memory.

Dreamscape Ads – AI-Powered Campaign for Marshall
Tuesday, January 20, 2026
This campaign was created to introduce Marshall’s new noise-cancelling headphones through an artistic lens rather than conventional tech marketing. The concept centered on translating sound into emotion, using dreamlike visual environments to show how the headphones transform chaotic noise into calm, immersive experiences.
Ruby developed a series of AI-generated short films titled Dreamscape Ads, where visuals evolved in real time based on the surrounding soundscape—ranging from rainforests and city streets to cafés and white noise environments. Audio-conditioned diffusion models were combined with post-processing in After Effects, and the work was presented through an immersive WebGL microsite. The campaign reached millions across digital platforms, achieved completion rates well above industry benchmarks, and was widely recognized for reframing noise cancellation as a sensory and emotional experience rather than a technical feature.

Neural Futures – Global Collaboration Project
Tuesday, January 20, 2026
This project was conceived as an experiment in collaborative authorship, exploring how AI could be used to democratize digital art rather than centralize creative control. The goal was to create a living artwork shaped by collective imagination, inviting contributors from around the world to participate in its evolution.
Participants from over 60 countries submitted images representing what “the future” meant to them, which were continuously interpreted by a custom-trained AI system to generate a massive, ever-changing digital mural. The artwork updated hourly, drawing from real-time submissions and forming a communal visual archive of global perspectives. Designed to be accessible across devices and languages, the project gained widespread attention online, attracted thousands of contributors, and was later presented as a reference model for ethical and participatory AI collaboration.

Echoes of Light – Interactive AI Art Installation
Tuesday, January 20, 2026
This site-specific installation explored how generative systems could respond directly to human presence, turning movement into an expressive visual language. Drawing from urban textures, city lights, and organic decay, the work was designed to feel reactive rather than pre-rendered, allowing visitors to experience the artwork as something shaped by their own physical motion.
Ruby trained a generative adversarial network on thousands of visual references and combined it with real-time pose estimation and motion sensing. As visitors moved through the space, their gestures were translated into shifting projections across a large-scale wall installation, producing dreamlike visuals that evolved continuously throughout the exhibition. The piece was presented during Nightshift at the LA Digital Art Biennale, where it drew sustained attendance, received critical recognition, and was later acquired for permanent display by a Berlin-based digital art museum.

AI Meets Fashion – Digital Runway for Chromat
Tuesday, January 20, 2026
This project reimagined the traditional fashion show as a digital-first experience, responding to the shift toward remote audiences and interactive media. The goal was not to replicate a physical runway, but to extend Chromat’s visual identity into a speculative space where form, motion, and texture could exist without physical constraints.
Ruby collaborated closely with Chromat’s design team to train a custom AI model using archival collections, runway footage, and experimental material studies. The resulting digital runway featured AI-generated models wearing garments that echoed Chromat’s aesthetic while pushing it into surreal territory. Streamed online with interactive elements, the experience allowed viewers to influence visual transitions in real time, generating significant engagement and broad media coverage across fashion and technology publications.
