Week 14: Building the Heart of the System – From Emotions to Architecture

  • Apr 2, 2025
  • 2 min read

Updated: May 21, 2025

Last week, we outlined the main stages of the user journey — from initiating the process to documenting, reflecting, and ultimately saving or sharing the memory. This week, we dove into the tech that will make it all possible. 🌐✨

From abstract emotions to concrete code – here’s what we built:


🧠 The Architecture Behind the Memories

Our system is designed as a client-server architecture, optimized for emotional storytelling, intuitive interaction, and secure memory preservation. It combines mobile UI, emotional content analysis, visual recognition, and advanced AI models.


Let’s break it down:

📱 Client Side – Emotional Companion App

Built with React Native (or Flutter), our mobile app is the user’s personal space to capture and reflect. It includes:

  • Personalized onboarding with emotional profiling

  • Memory documentation screens by object or theme

  • Guided question flows for emotional storytelling

  • Organized memory library (by emotion, date, topic)

  • Session summaries that reflect back personal emotional insights

Our goal here? Create a warm, human experience that feels more like journaling with a friend than using an app.
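To make "guided question flows" concrete: the app can render a flow served as plain data from the backend. Here's a minimal Python sketch of that idea — the flow contents and all names (`QUESTION_FLOW`, `next_question`) are illustrative, not our final schema:

```python
# A guided question flow as plain data the app renders one step at a time.
# The questions and field names here are made up for illustration.

QUESTION_FLOW = [
    {"id": "origin", "prompt": "Where did this object come from?"},
    {"id": "people", "prompt": "Who do you associate with it?"},
    {"id": "feeling", "prompt": "What do you feel when you hold it?"},
]

def next_question(answered_ids):
    """Return the first prompt the user hasn't answered yet, or None when done."""
    for step in QUESTION_FLOW:
        if step["id"] not in answered_ids:
            return step["prompt"]
    return None

print(next_question({"origin"}))  # → "Who do you associate with it?"
```

Keeping the flow as data (rather than hard-coding it in the UI) lets us tune the questions per emotional profile without shipping an app update.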


🧰 Server Side – The Silent Engine

The backend (Node.js, or Python with Flask) powers everything behind the scenes:

  • User and memory data management

  • Emotional content analysis using AI

  • Personalization logic

  • Secure storage with flexible data models

It’s the brain that processes and organizes everything users share.
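As a taste of that brain at work, here's a framework-agnostic sketch of the "create memory" endpoint logic in Python. The Flask (or Node) routing around it is assumed, and the field names are illustrative:

```python
# Sketch of the "create memory" handler body; routing/auth are assumed
# to live in the web framework. Field names are illustrative only.

REQUIRED_FIELDS = {"user_id", "object_id", "text"}

def create_memory(payload: dict) -> dict:
    """Validate an incoming memory and return the record to store."""
    missing = REQUIRED_FIELDS - payload.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return {
        "user_id": payload["user_id"],
        "object_id": payload["object_id"],
        "text": payload["text"],
        "tags": payload.get("tags", []),  # emotional tags filled in later by the AI step
    }

record = create_memory({"user_id": "u1", "object_id": "o9", "text": "Grandma's ring"})
print(record["tags"])  # → []
```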


🗃 Database – Emotionally Aware Storage

Using Firestore or MongoDB, we’ve structured the data to capture the full story behind every object:

  • Users – personal info, emotional state, family links

  • Memories – texts, media, tags

  • Objects – item metadata and images

  • Sessions – every interaction

  • EmotionalTags – from “hope” to “longing”

  • FamilyLinks – allowing private sharing across relatives
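Both Firestore and MongoDB store these collections as JSON-like documents, so the shapes above map naturally onto plain records. A sketch of two of them in Python (field names illustrative, not our final schema):

```python
# Illustrative document shapes for two of the collections above.
# Firestore/MongoDB would hold these as JSON-like documents.
from dataclasses import dataclass, field

@dataclass
class Memory:
    memory_id: str
    user_id: str
    object_id: str
    text: str
    emotional_tags: list = field(default_factory=list)  # e.g. ["hope", "longing"]
    media: list = field(default_factory=list)           # references to images/audio

@dataclass
class FamilyLink:
    owner_id: str
    relative_id: str
    shared_memory_ids: list = field(default_factory=list)

m = Memory("m1", "u1", "o9", "Grandma's kitchen table", ["longing"])
print(m.emotional_tags)  # → ['longing']
```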


🤖 AI Models – Feeling the Words

To bring emotional depth, we integrated a suite of natural language and machine learning models:

  • VADER – for fast sentiment analysis of short text

  • BERT (fine-tuned on GoEmotions) – detecting complex emotions like pride or nostalgia

  • GPT (via API) – generating personalized emotional summaries

  • LDA Topic Modeling – organizing memories into clusters like “grandma’s kitchen” or “military service”

Each model helps translate raw memory into meaningful insight.
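To show what lexicon-based scoring (VADER's core idea) looks like, here's a toy Python illustration — the real system would call a library like vaderSentiment, and this five-word lexicon is entirely made up:

```python
# Toy illustration of VADER-style lexicon scoring. The real pipeline would
# use vaderSentiment's analyzer; this tiny lexicon exists only for the demo.

LEXICON = {"love": 2.0, "miss": -1.0, "happy": 1.5, "lost": -1.5, "hope": 1.0}

def sentiment_score(text: str) -> float:
    """Average the lexicon valence over the words of a short text."""
    words = text.lower().split()
    hits = [LEXICON[w] for w in words if w in LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

print(sentiment_score("i love and miss her"))  # → 0.5
```

This is also why we pair VADER with BERT: a fixed lexicon scores "love" and "miss" but can't tell nostalgia from grief — that nuance needs a fine-tuned model.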


🔍 OCR – Seeing the Written Word

Objects often contain inscriptions, recipes, or handwritten notes. That’s where Tesseract OCR or Google Vision API comes in – extracting text from images to help preserve those tiny but powerful details.
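OCR output usually needs light cleanup before it's stored. A small Python sketch of that step — the pytesseract call is shown as a comment since it needs the Tesseract binary installed, and the helper name is our own:

```python
# Cleanup step for OCR output: collapse runs of whitespace and drop blank lines.
# The pytesseract call itself is commented out because it requires the
# Tesseract binary; the helper name (normalize_ocr_text) is illustrative.
import re

def normalize_ocr_text(raw: str) -> str:
    """Collapse whitespace within lines and drop empty lines from OCR output."""
    lines = [re.sub(r"\s+", " ", ln).strip() for ln in raw.splitlines()]
    return "\n".join(ln for ln in lines if ln)

# import pytesseract
# from PIL import Image
# raw = pytesseract.image_to_string(Image.open("inscription.jpg"))

print(normalize_ocr_text("  Dear   Anna,\n\n  with  love \n"))
```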


🔐 Privacy & Security – Trust First

Because memories are sacred, we built strong foundations for privacy and control:

  • End-to-end encryption of all user content

  • Clear permission settings for every memory (private / family / shared)

  • Secure login and authentication

  • Full user ownership — including the ability to export all memories as an offline digital memory book (PDF/ZIP), grouped by topic or emotion
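The three permission levels above boil down to a simple ordering check. A minimal Python sketch, assuming viewers fall into "owner", "family", or "other" (names illustrative):

```python
# Sketch of the per-memory permission check. Levels and relation names
# are illustrative; the real rules may be richer.

LEVELS = {"private": 0, "family": 1, "shared": 2}

def can_view(memory_level: str, viewer_relation: str) -> bool:
    """Owners see everything, family sees family+shared, others see shared only."""
    minimum_level = {"owner": 0, "family": 1, "other": 2}[viewer_relation]
    return LEVELS[memory_level] >= minimum_level

print(can_view("family", "family"))   # → True
print(can_view("private", "family"))  # → False
```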

We’re not just storing content. We’re preserving legacy.







© 2024 by MiLab. Powered and secured by Wix
