rdune71 committed
Commit 860bf55 · Parent(s): b127732

Add proper Hugging Face Spaces configuration to README.md

Files changed (1)
  1. README.md +62 -0
README.md CHANGED
@@ -0,0 +1,62 @@
+ ---
+ title: AI Life Coach
+ emoji: 🧘
+ colorFrom: purple
+ colorTo: blue
+ sdk: streamlit
+ sdk_version: "1.24.0"
+ app_file: app.py
+ pinned: false
+ ---
+
+ # AI Life Coach 🧘
+
+ Your personal AI-powered life coaching assistant.
+
+ ## Features
+
+ - Personalized life coaching conversations
+ - Redis-based conversation memory
+ - Multiple LLM provider support (Ollama, Hugging Face, OpenAI)
+ - Dynamic model selection
+ - Remote Ollama integration via ngrok
+
+ ## How to Use
+
+ 1. Select a user from the sidebar
+ 2. Configure your Ollama connection (if using remote Ollama)
+ 3. Choose your preferred model
+ 4. Start chatting with your AI Life Coach!
+
+ ## Requirements
+
+ All requirements are specified in `requirements.txt`. The app automatically handles:
+ - Streamlit UI
+ - FastAPI backend (for future expansion)
+ - Redis connection for persistent memory
+ - Multiple LLM integrations
+
+ ## Environment Variables
+
+ Configure these in your Hugging Face Space secrets or local `.env` file:
+
+ - `OLLAMA_HOST`: Your Ollama server URL (default: ngrok URL)
+ - `LOCAL_MODEL_NAME`: Default model name (default: mistral)
+ - `HF_TOKEN`: Hugging Face API token (for Hugging Face models)
+ - `HF_API_ENDPOINT_URL`: Hugging Face inference API endpoint
+ - `USE_FALLBACK`: Whether to use fallback providers (true/false)
+ - `REDIS_HOST`: Redis server hostname (default: localhost)
+ - `REDIS_PORT`: Redis server port (default: 6379)
+ - `REDIS_USERNAME`: Redis username (optional)
+ - `REDIS_PASSWORD`: Redis password (optional)
+
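The variable list above could be supplied via a local `.env` file; a minimal sketch (all values are placeholders, not taken from the commit):

```ini
# Hypothetical .env for local development; replace every placeholder.
OLLAMA_HOST=<your-ngrok-or-local-ollama-url>
LOCAL_MODEL_NAME=mistral
HF_TOKEN=<your-hugging-face-token>
HF_API_ENDPOINT_URL=<your-inference-endpoint-url>
USE_FALLBACK=false
REDIS_HOST=localhost
REDIS_PORT=6379
REDIS_USERNAME=<optional-redis-username>
REDIS_PASSWORD=<optional-redis-password>
```

On Hugging Face Spaces the same keys would instead go into the Space's secrets settings rather than a committed file.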
+ ## Architecture
+
+ This application consists of:
+ - Streamlit frontend (`app.py`)
+ - Core LLM abstraction (`core/llm.py`)
+ - Memory management (`core/memory.py`)
+ - Configuration management (`utils/config.py`)
+ - API endpoints (in `api/` directory for future expansion)
+
+ Built with Python, Streamlit, FastAPI, and Redis.
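The configuration-management piece described above might read the documented environment variables roughly like this. This is a hypothetical sketch, not the actual `utils/config.py` from the commit; the function name, dictionary keys, and defaults are assumptions based on the variable list in the README (the real `OLLAMA_HOST` default, an ngrok URL, is not stated there, so none is hard-coded here):

```python
import os

def load_config() -> dict:
    """Assemble app settings from environment variables (hypothetical sketch).

    Defaults mirror those documented in the README; optional values
    fall back to None when unset.
    """
    return {
        # README says the default is an ngrok URL, which is not specified,
        # so no default is assumed here.
        "ollama_host": os.getenv("OLLAMA_HOST"),
        "local_model_name": os.getenv("LOCAL_MODEL_NAME", "mistral"),
        "hf_token": os.getenv("HF_TOKEN"),
        "hf_api_endpoint_url": os.getenv("HF_API_ENDPOINT_URL"),
        # "true"/"false" string coerced to a real boolean.
        "use_fallback": os.getenv("USE_FALLBACK", "false").lower() == "true",
        "redis_host": os.getenv("REDIS_HOST", "localhost"),
        "redis_port": int(os.getenv("REDIS_PORT", "6379")),
        "redis_username": os.getenv("REDIS_USERNAME"),
        "redis_password": os.getenv("REDIS_PASSWORD"),
    }
```

Reading everything through one function keeps Space secrets and a local `.env` interchangeable: both simply populate the process environment before `load_config()` is called.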