Binary file added team-152/Pitch Deck Marhaba Morocco.pdf
Binary file not shown.
76 changes: 76 additions & 0 deletions team-152/README.md
@@ -0,0 +1,76 @@
# Marhaba Morocco (Hackathon)


- Live demo of the Marhaba-Morocco app: [Marhaba-Morocco-app](https://marhaba-morocco.streamlit.app/)
- Figma design of the Marhaba-Morocco app: [Marhaba-Morocco-design](https://www.figma.com/design/e2Th6xQ4EvAaYwoPeHBzeN/Marhaba-Morocco?node-id=154-15406&t=giX5iXYplDUsToEY-1)
- Video presentation of the Marhaba-Morocco app: [video-presentation](https://drive.google.com/file/d/1eGzoz7EJ2z6cKwwEHQv68WUKnWRmE1mN/view?usp=sharing)
- Video demo of the Marhaba-Morocco app: [video-demo](https://drive.google.com/file/d/1P2S0oJvJQAcTiU3ctTCJRs2uIA5ngP43/view?usp=sharing)



## Abstract

### Background and Problem Statement

Travelers to Morocco often struggle to find reliable information about accommodations, local laws, and culture, and to turn that information into a trip plan. This project streamlines gathering and presenting this information, making it easier for users to plan their trips effectively.

### Impact and Proposed Solution

The proposed solution is a virtual concierge application that leverages AI to assist users in obtaining tailored recommendations for hotels, cultural insights, and legal information. By integrating various APIs and utilizing natural language processing, the application can provide accurate and relevant responses to user queries, enhancing the travel experience and ensuring users have access to the information they need.
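
At a high level, each user message is first classified into one of the routes (hotels, cultural Q&A, local-law Q&A, trip planning) and then handled by the matching module. The sketch below is simplified from `dispatcher.py` and `culture.py` in this repository; the actual wiring in `main.py` may differ, and only the culture route is shown end to end. `history` is assumed to be an OpenAI-style message list that already includes the latest user turn.

```python
from dispatcher import Dispatcher
from culture import Culture

dispatcher = Dispatcher()
culture = Culture()

def answer(user_message: str, history: list) -> str:
    """Route a user message and produce a reply (only the culture route is shown here)."""
    route = dispatcher.run(user_message, history)  # "hotels", "local_law_qna", "cultural_qna", "trip_plan", or "none"
    if route == "cultural_qna":
        keywords = culture.refactor(user_message)   # turn the question into search keywords
        results = culture.search_api(keywords)      # fetch web results for those keywords
        return culture.response(user_message, results, history)
    if route == "none":
        return dispatcher.retry_msg(history)        # politely ask the user to clarify
    return f"Routing to the '{route}' handler..."   # remaining handlers omitted in this sketch
```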

### Project Outcomes and Deliverables

- A fully functional virtual concierge application that can:
- Provide hotel recommendations based on user preferences.
- Answer questions related to Moroccan culture and traditions.
- Offer insights into local laws and regulations.
- Clear documentation on how to use the application.
- A user-friendly interface built with Streamlit for easy interaction.

## Instructions

1. **Clone the Repository**

```bash
git clone <repository-url>
cd <repository-directory>
```

2. **Set Up Environment**

- Ensure you have Python 3.8 or higher installed.
- Create a virtual environment:
```bash
python -m venv venv
source venv/bin/activate # On Windows use `venv\Scripts\activate`
```

3. **Install Dependencies**

- Install the required libraries:
```bash
pip install -r requirements.txt
```

4. **Set Up Environment Variables**

- Create a `.env` file in the root directory and add your API keys (a sketch of how the app reads them follows these steps):
```
OPENAI_API_KEY=<your_openai_api_key>
PINECONE_API_KEY=<your_pinecone_api_key>
HOTELS_API_KEY=<your_hotels_api_key>
```

5. **Run the Application**

- Start the Streamlit application:
```bash
streamlit run main.py
```

6. **Interact with the Application**
- Open your web browser and go to `http://localhost:8501` to start using the virtual concierge.
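
For reference, the application modules read these keys at startup via `python-dotenv`; here is a minimal sketch of the pattern used in `culture.py` and `dispatcher.py`:

```python
import os
from dotenv import load_dotenv

load_dotenv()  # loads variables from the .env file in the project root
openai_key = os.environ["OPENAI_API_KEY"]   # picked up automatically by the OpenAI() client
search_key = os.environ["HOTELS_API_KEY"]   # RapidAPI key used for the search endpoints
```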

## Video Demo

- Watch the video demo of the Marhaba-Morocco app [here](https://drive.google.com/file/d/1P2S0oJvJQAcTiU3ctTCJRs2uIA5ngP43/view?usp=sharing).
Binary file added team-152/__pycache__/culture.cpython-313.pyc
Binary file not shown.
Binary file added team-152/__pycache__/dispatcher.cpython-313.pyc
Binary file not shown.
Binary file added team-152/__pycache__/hotels.cpython-313.pyc
Binary file not shown.
Binary file added team-152/__pycache__/law.cpython-313.pyc
Binary file not shown.
106 changes: 106 additions & 0 deletions team-152/culture.py
@@ -0,0 +1,106 @@
import os
from openai import OpenAI
from dotenv import load_dotenv
import requests

class Culture:
def __init__(self):

load_dotenv()
self.hotelsKey = os.environ["HOTELS_API_KEY"]

self.system_prompt = """
You are a well-informed virtual concierge specializing in Moroccan tourism. With a deep understanding of the region's attractions, culture, and hospitality, you are dedicated to helping travelers get exactly what they need. Your skill lies in accurately interpreting user requests and efficiently assigning them to the correct process.

Your task is to help users find specific information related to Moroccan culture, including but not limited to food, music, history, festivals, art, and traditions. If the user is asking about Moroccan culture, your goal is to:

1. Analyze the user's input to understand what aspect of Moroccan culture they are interested in (e.g., food, music, festivals, history, art).
2. Extract the relevant keywords or phrases from the user's query that will help in conducting a Google search for information about Moroccan culture.
3. Refactor these keywords or phrases into a format suitable for a Google search query, where the keywords are space-separated.

Instructions:
- If the user asks about a specific aspect of Moroccan culture (e.g., "Tell me about Moroccan food" or "What are the popular Moroccan festivals?"), extract keywords such as "food" or "festivals" and format them for a Google search.
- If the user's query is unclear, politely ask them to specify what aspect of Moroccan culture they would like to know more about.
- If the user's request is not related to Moroccan culture, respond according to the normal flow of the concierge process (hotel recommendations, trip planning, local laws, etc.).

Example 1:
User Input: "Tell me about the food in Morocco"
Output: "food Moroccan cuisine traditional Moroccan dishes"

Example 2:
User Input: "What are some popular Moroccan festivals?"
Output: "festivals Moroccan festivals celebrations"

Your response should be a single string of space-separated keywords, ready to be used in a Google search query.
"""
self.client = OpenAI()

def refactor(self,prompt):
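        # Ask the model to distill the user's question into space-separated keywords for a web search.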

completion = self.client.chat.completions.create(
model="gpt-4o-mini", # Use GPT-4 or gpt-3.5-turbo
messages=[
{"role": "system", "content": self.system_prompt},
{"role": "user", "content": str(prompt)}
],
            temperature=0.5  # moderate temperature: focused keyword extraction with a little variation
)
return str(completion.choices[0].message.content)

def search_api(self,search):
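        # Query the Searx search API on RapidAPI with the extracted keywords and return its result list.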
print(search)
url = "https://searx-search-api.p.rapidapi.com/search"

data = {"q":search,"format":"json"}

headers = {
"x-rapidapi-key": self.hotelsKey,
"x-rapidapi-host": "searx-search-api.p.rapidapi.com"
}

response = requests.get(url, headers=headers, params=data)
print(response)
return response.json()["results"]

def response(self,prompt,data,msg_hist):
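        # Turn the retrieved search data into a conversational answer to the user's original question.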
system = f"""
You are a well-informed virtual concierge specializing in Moroccan tourism. Your role is to assist travelers by providing relevant, concise, and accurate information based on their queries.

Your task:
- You have been provided with data retrieved from an API related to Moroccan tourism (e.g., hotel information, cultural insights, or festival details).
- You must analyze the data, interpret it, and generate a polite, clear, and informative response to the user based on the details available.
- The response should be formatted in a conversational manner, addressing the user's query directly using the information from the data.

prompt:
```
{prompt}
```
Data:
```
{data}
```
Your output should:
- Be clear and helpful, summarizing the key points from the data.
- Provide any relevant details directly related to the user's query.
- Avoid including raw data or irrelevant details. Focus on what the user would find most useful based on their request.

Example 1:
User Input: "Tell me about the hotels in Marrakesh."
Data: "Hotel A: 4 stars, $100 per night, near Jemaa el-Fnaa, free Wi-Fi. Hotel B: 5 stars, $200 per night, with pool, located in the medina."
Response: "In Marrakesh, you can find several great hotels. For a budget-friendly option, Hotel A offers a 4-star experience with free Wi-Fi and is conveniently located near Jemaa el-Fnaa for just $100 per night. If you're looking for something more luxurious, Hotel B is a 5-star property with a pool and is located in the medina, priced at $200 per night."

Please ensure that the response is well-structured and helpful to the user, incorporating the relevant data appropriately.
"""

        msg_hist_tmp = list(msg_hist)
        msg_hist_tmp.insert(0, {"role": "system", "content": system})  # use the data-filled system prompt

completion = self.client.chat.completions.create(
model="gpt-4o-mini", # Use GPT-4 or gpt-3.5-turbo
messages=msg_hist_tmp,
            temperature=0.7  # allow some variety in the conversational reply
)

return str(completion.choices[0].message.content)
103 changes: 103 additions & 0 deletions team-152/dispatcher.py
@@ -0,0 +1,103 @@
import os
from openai import OpenAI
from dotenv import load_dotenv

class Dispatcher:

def __init__(self):

load_dotenv()
self.openAiKey = os.environ["OPENAI_API_KEY"]

self.system_prompt = """
backstory:
You are a well-informed virtual concierge specializing in Moroccan tourism.
With a deep understanding of the region's attractions, culture, and hospitality,
you are dedicated to helping travelers get exactly what they need. Your skill
lies in accurately interpreting user requests and efficiently assigning them
to the correct process.

goal:
Analyze user input to determine whether they need hotel recommendations,
cultural Q&A, or a full trip plan, and forward the request accordingly.

role:
        Tourism Request Dispatcher. Return exactly one of: `hotels`, `local_law_qna`, `cultural_qna`, `trip_plan`, `none`.

task:
        Based on the user's input, determine the request type and
        return exactly one of: `hotels`, `local_law_qna`, `cultural_qna`, `trip_plan`, `none`.
"""
self.client = OpenAI()

def run(self,prompt,msg_hist):
# Call the OpenAI API with the new interface
msg_hist_tmp = list(msg_hist)
# msg_hist_tmp.insert(0, {"role": "system", "content": self.system_prompt})
print(msg_hist_tmp)
completion = self.client.chat.completions.create(
model="gpt-4o-mini", # Use GPT-4 or gpt-3.5-turbo
messages=[{"role": "system", "content": self.system_prompt}]+msg_hist_tmp,
temperature=0 # Set to 0 for deterministic responses
)
output = completion.choices[0].message.content
print(output)
if str(output) in ["hotels","local_law_qna","cultural_qna","trip_plan"]:

return output
else:
return "none"

def welcome_msg(self):
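        # Generate a short, friendly greeting that opens with "Marhaba in Morocco!".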
prompt = """
Backstory:
You are a knowledgeable virtual concierge with expertise in Moroccan tourism.
You excel at understanding user needs and guiding them toward the best solutions, whether it's accommodations, cultural insights, or full trip planning.

Your role:
Craft a warm and inviting welcome message that starts with 'Marhaba in Morocco!'

Requirements:
- Begin with 'Marhaba in Morocco!'
- Make the message engaging and slightly different from:
'Marhaba in Morocco! How can I assist you today—hotels, culture, trip planning, or local laws?'
- Be a bit creative and keep the tone friendly and helpful.
        - Keep it to a maximum of 30 words.
"""
completion = self.client.chat.completions.create(
model="gpt-4o-mini", # Use GPT-4 or gpt-3.5-turbo
messages=[
{"role": "system", "content": prompt},
{"role": "user", "content": "Hi"}
],
            temperature=0.6  # some variety so the welcome message differs between sessions
)
return str(completion.choices[0].message.content)

def retry_msg(self,msg_hist):
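        # Politely ask the user to clarify what they need (hotels, culture, trip planning, or local laws).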
prompt = """
Backstory:
You are a knowledgeable virtual concierge with expertise in Moroccan tourism.
Your primary role is to help users by accurately interpreting their requests and guiding them to the right resources, whether it’s accommodations, cultural insights, trip planning, or local laws.

Context:
        The user has provided an unclear or unhelpful response and has not specified their needs.
        Do not answer questions that fall outside your role.

Task:
Generate a polite and encouraging message prompting the user to clarify their request.

Requirements:
- Be clear and concise.
- Gently ask the user to specify their interest.
- Example: 'To assist you better, please let me know your focus: hotels, culture, trip planning, or local laws?'
"""

msg_hist_tmp = list(msg_hist)
msg_hist_tmp.insert(0, {"role": "system", "content": prompt})
completion = self.client.chat.completions.create(
model="gpt-4o-mini", # Use GPT-4 or gpt-3.5-turbo
messages=msg_hist_tmp,
            temperature=0.6  # some variety in the clarification prompt
)
return str(completion.choices[0].message.content)