13 Commits

Author SHA1 Message Date
dd277287af basis for paypal transactions
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m48s
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 50s
Build and release debug APK / Build APK (pull_request) Failing after 3m50s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 27s
2025-02-17 11:54:03 +01:00
f258df8e72 better endpoints
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m15s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 48s
Build and release debug APK / Build APK (pull_request) Failing after 4m38s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 25s
2025-02-17 10:27:02 +01:00
fd091a9ccc fixed print
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m49s
Run linting on the backend code / Build (pull_request) Successful in 30s
Run testing on the backend code / Build (pull_request) Failing after 50s
Build and release debug APK / Build APK (pull_request) Failing after 3m27s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 25s
2025-02-14 17:19:27 +01:00
f81c28f2ac user creation and deletetion endpoint
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m49s
Run linting on the backend code / Build (pull_request) Successful in 46s
Run testing on the backend code / Build (pull_request) Failing after 48s
Build and release debug APK / Build APK (pull_request) Failing after 3m35s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 26s
2025-02-14 16:47:37 +01:00
361b2b1f42 full passed tests
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m12s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 50s
Build and release debug APK / Build APK (pull_request) Failing after 3m46s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 25s
2025-02-14 10:44:09 +01:00
16918369d7 corrected supabase api communication
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m26s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 49s
Build and release debug APK / Build APK (pull_request) Failing after 3m22s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 25s
2025-02-14 10:33:57 +01:00
2c49480966 supabase implementation
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m42s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 50s
Build and release debug APK / Build APK (pull_request) Failing after 3m6s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-02-12 21:36:15 +01:00
3a9ef4e7d3 starting to implement paywall logic
Some checks failed
Run testing on the backend code / Build (pull_request) Has been cancelled
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
Run linting on the backend code / Build (pull_request) Has been cancelled
2025-02-11 17:19:03 +01:00
c15e257dea add trip time update
Some checks failed
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
Run testing on the backend code / Build (pull_request) Has been cancelled
Run linting on the backend code / Build (pull_request) Successful in 27s
2025-02-11 15:42:14 +01:00
5a698dd02c Merge pull request 'Adding licenses' (#58) from licenses into main
Reviewed-on: #58
2025-02-11 08:36:16 +00:00
7e4a4b3dc7 added general license 2025-02-11 08:25:02 +00:00
84e5902436 Merge pull request 'fixed cluster names' (#57) from backend/fix/missing-cluster-names into main
Some checks failed
Build and deploy the backend to production / Build and push image (push) Successful in 2m9s
/ push-to-remote (push) Failing after 41s
Build and deploy the backend to production / Deploy to production (push) Successful in 23s
Reviewed-on: #57
2025-02-11 06:50:11 +00:00
81330e5eb3 fixed cluster names
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m54s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 4m19s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-02-11 07:34:50 +01:00
25 changed files with 2644 additions and 202 deletions

12
.vscode/launch.json vendored
View File

@@ -21,15 +21,21 @@
]
},
{
"name": "Backend - tester",
"name": "Backend - test",
"type": "debugpy",
"request": "launch",
"program": "src/tester.py",
"module": "pytest",
"args": [
"src/tests",
"--log-cli-level=DEBUG",
"--html=report.html",
"--self-contained-html"
],
"env": {
"DEBUG": "true"
},
"cwd": "${workspaceFolder}/backend"
},
},
// frontend - flutter app
{
"name": "Frontend - debug",

30
LICENSE.md Normal file
View File

@@ -0,0 +1,30 @@
# License
## Proprietary License
All code and resources in this repository are the property of AnyDev. The software and related documentation are provided solely for use with services provided by AnyDev. Redistribution, modification, or use of this software outside of its intended service is strictly prohibited without explicit permission.
### Copyright © 2024 AnyDev
All rights reserved.
### Restrictions
- You may not modify, distribute, copy, or reverse engineer any part of this codebase.
- This software is licensed for use solely in conjunction with services provided by AnyDev.
- Any commercial use of this software is strictly prohibited without explicit written consent from AnyDev.
## Third-Party Dependencies
This project uses third-party dependencies, which are subject to their respective licenses.
- Python backend dependencies: fastapi, pydantic, numpy, shapely, etc. Licensed under their respective licenses.
- Flutter frontend dependencies: Cupertino Icons, sliding_up_panel, http, etc. Licensed under their respective licenses.
Please refer to each project's documentation for the specific terms and conditions.
## OpenStreetMap Data Usage
This project uses data derived from **OpenStreetMap**. OpenStreetMap data is available under the [Open Database License (ODbL)](https://www.openstreetmap.org/copyright). We comply with the ODbL license, and some of the data displayed in the service may be derived from OpenStreetMap sources. We do not redistribute raw OpenStreetMap data; instead, it is processed and transformed before being used in our services.
More information about OpenStreetMap data usage can be found [here](https://www.openstreetmap.org/copyright).

3
backend/.gitignore vendored
View File

@@ -1,6 +1,9 @@
# osm-cache
cache_XML/
# secrets
*secrets.yaml
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]

View File

@@ -25,3 +25,5 @@ loki-logger-handler = "*"
pulp = "*"
scipy = "*"
requests = "*"
supabase = "*"
paypalrestsdk = "*"

1077
backend/Pipfile.lock generated

File diff suppressed because it is too large.

View File

@@ -328,7 +328,7 @@ div.media {
</head>
<body>
<h1 id="title">Backend Testing Report</h1>
<p>Report generated on 29-Jan-2025 at 09:35:03 by <a href="https://pypi.python.org/pypi/pytest-html">pytest-html</a>
<p>Report generated on 17-Feb-2025 at 11:20:02 by <a href="https://pypi.python.org/pypi/pytest-html">pytest-html</a>
v4.1.1</p>
<div id="environment-header">
<h2>Environment</h2>
@@ -382,7 +382,7 @@ div.media {
<h2>Summary</h2>
<div class="additional-summary prefix">
</div>
<p class="run-count">1 test took 97 ms.</p>
<p class="run-count">1 test took 524 ms.</p>
<p class="filter">(Un)check the boxes to filter the results.</p>
<div class="summary__reload">
<div class="summary__reload__button hidden" onclick="location.reload()">
@@ -432,7 +432,7 @@ div.media {
</table>
</body>
<footer>
<div id="data-container" data-jsonblob="{&#34;environment&#34;: {&#34;Python&#34;: &#34;3.12.3&#34;, &#34;Platform&#34;: &#34;Linux-6.8.0-51-generic-x86_64-with-glibc2.39&#34;, &#34;Packages&#34;: {&#34;pytest&#34;: &#34;8.3.4&#34;, &#34;pluggy&#34;: &#34;1.5.0&#34;}, &#34;Plugins&#34;: {&#34;html&#34;: &#34;4.1.1&#34;, &#34;anyio&#34;: &#34;4.8.0&#34;, &#34;metadata&#34;: &#34;3.1.1&#34;}}, &#34;tests&#34;: {&#34;src/tests/test_main.py::test_turckheim&#34;: [{&#34;extras&#34;: [], &#34;result&#34;: &#34;Passed&#34;, &#34;testId&#34;: &#34;src/tests/test_main.py::test_turckheim&#34;, &#34;resultsTableRow&#34;: [&#34;&lt;td class=\&#34;col-result\&#34;&gt;Passed&lt;/td&gt;&#34;, &#34;&lt;td class=\&#34;col-testId\&#34;&gt;src/tests/test_main.py::test_turckheim&lt;/td&gt;&#34;, &#34;&lt;td&gt;start (0 | 0) - 3 - Mairie du 2e arrondissement (78 | 5) - 1 - Basilique Saint-Martin d&#39;Ainay (406 | 5) - 3 - Chapelle Paul Couturier (109 | 5) - 1 - finish (0 | 0) - 0&lt;/td&gt;&#34;, &#34;&lt;td&gt;23 min&lt;/td&gt;&#34;, &#34;&lt;td&gt;20 min&lt;/td&gt;&#34;, &#34;&lt;td class=\&#34;col-duration\&#34;&gt;97 ms&lt;/td&gt;&#34;, &#34;&lt;td class=\&#34;col-links\&#34;&gt;&lt;/td&gt;&#34;], &#34;log&#34;: &#34;------------------------------ Captured log call -------------------------------\nDEBUG asyncio:selector_events.py:64 Using selector: EpollSelector\nINFO src.main:main.py:67 No end coordinates provided. Using start=end.\nDEBUG src.utils.landmarks_manager:landmarks_manager.py:76 Starting to fetch landmarks...\nDEBUG src.utils.landmarks_manager:landmarks_manager.py:88 Fetching sightseeing landmarks...\nINFO src.overpass.overpass:overpass.py:55 Cache hit for 2/2 quadrants.\nINFO src.overpass.overpass:overpass.py:55 Cache hit for 2/2 quadrants.\nINFO src.overpass.overpass:overpass.py:55 Cache hit for 2/2 quadrants.\nINFO src.overpass.overpass:overpass.py:55 Cache hit for 2/2 quadrants.\nINFO src.overpass.overpass:overpass.py:55 Cache hit for 2/2 quadrants.\nINFO src.overpass.overpass:overpass.py:55 Cache hit for 2/2 quadrants.\nDEBUG src.utils.landmarks_manager:landmarks_manager.py:204 Fetched 16 landmarks of type sightseeing in (45.74643460077641, 4.819803369760263, 45.756693361321055, 4.834505559895031)\nINFO src.utils.landmarks_manager:landmarks_manager.py:91 Found 16 sightseeing landmarks\nINFO src.overpass.overpass:overpass.py:55 Cache hit for 2/2 quadrants.\nINFO src.utils.cluster_manager:cluster_manager.py:145 Found 0 sightseeing clusters.\nINFO src.main:main.py:104 Fetched 12 landmarks in \t: 0.011 seconds\nDEBUG src.optimization.optimizer:optimizer.py:597 First results are out. Looking out for circles and correcting.\nINFO src.optimization.optimizer:optimizer.py:637 Re-optimized 0 times, objective value : 593\nDEBUG src.optimization.refiner:refiner.py:345 Using 2 minor landmarks around the predicted path\nDEBUG src.optimization.optimizer:optimizer.py:597 First results are out. 
Looking out for circles and correcting.\nINFO src.optimization.optimizer:optimizer.py:637 Re-optimized 0 times, objective value : 593\nDEBUG src.main:main.py:130 First stage optimization\t: 0.045 seconds\nDEBUG src.main:main.py:131 Second stage optimization\t: 0.025 seconds\nINFO src.main:main.py:132 Total computation time\t: 0.071 seconds\nINFO src.main:main.py:137 Generated a trip of 23 minutes with 5 landmarks in 0.082 seconds.\nINFO httpx:_client.py:1025 HTTP Request: POST http://testserver/trip/new &amp;quot;HTTP/1.1 200 OK&amp;quot;\n\n&#34;}]}, &#34;renderCollapsed&#34;: [&#34;passed&#34;], &#34;initialSort&#34;: &#34;result&#34;, &#34;title&#34;: &#34;Backend Testing Report&#34;}"></div>
<div id="data-container" data-jsonblob="{&#34;environment&#34;: {&#34;Python&#34;: &#34;3.12.3&#34;, &#34;Platform&#34;: &#34;Linux-6.8.0-53-generic-x86_64-with-glibc2.39&#34;, &#34;Packages&#34;: {&#34;pytest&#34;: &#34;8.3.4&#34;, &#34;pluggy&#34;: &#34;1.5.0&#34;}, &#34;Plugins&#34;: {&#34;html&#34;: &#34;4.1.1&#34;, &#34;anyio&#34;: &#34;4.8.0&#34;, &#34;metadata&#34;: &#34;3.1.1&#34;}}, &#34;tests&#34;: {&#34;src/tests/test_user.py::test_user_handling&#34;: [{&#34;extras&#34;: [], &#34;result&#34;: &#34;Passed&#34;, &#34;testId&#34;: &#34;src/tests/test_user.py::test_user_handling&#34;, &#34;resultsTableRow&#34;: [&#34;&lt;td class=\&#34;col-result\&#34;&gt;Passed&lt;/td&gt;&#34;, &#34;&lt;td class=\&#34;col-testId\&#34;&gt;src/tests/test_user.py::test_user_handling&lt;/td&gt;&#34;, &#34;&lt;td&gt;N/A&lt;/td&gt;&#34;, &#34;&lt;td&gt;N/A&lt;/td&gt;&#34;, &#34;&lt;td&gt;N/A&lt;/td&gt;&#34;, &#34;&lt;td class=\&#34;col-duration\&#34;&gt;524 ms&lt;/td&gt;&#34;, &#34;&lt;td class=\&#34;col-links\&#34;&gt;&lt;/td&gt;&#34;], &#34;log&#34;: &#34;------------------------------ Captured log call -------------------------------\nDEBUG asyncio:selector_events.py:64 Using selector: EpollSelector\nINFO src.payments.supabase_routes:supabase_routes.py:34 User created successfully, ID: e0b9176d-0211-43fc-b8c7-db815293835e\nDEBUG asyncio:selector_events.py:64 Using selector: EpollSelector\nERROR src.payments.supabase_routes:supabase_routes.py:26 Failed to create user : email_exists\nDEBUG asyncio:selector_events.py:64 Using selector: EpollSelector\nDEBUG src.payments.supabase_routes:supabase_routes.py:44 None\nINFO src.payments.supabase_routes:supabase_routes.py:52 User with ID e0b9176d-0211-43fc-b8c7-db815293835e deleted successfully\nDEBUG asyncio:selector_events.py:64 Using selector: EpollSelector\nERROR src.payments.supabase_routes:supabase_routes.py:47 Failed to delete user : user_not_found\n\n&#34;}]}, &#34;renderCollapsed&#34;: [&#34;passed&#34;], &#34;initialSort&#34;: &#34;result&#34;, &#34;title&#34;: &#34;Backend Testing Report&#34;}"></div>
<script>
(function(){function r(e,n,t){function o(i,f){if(!n[i]){if(!e[i]){var c="function"==typeof require&&require;if(!f&&c)return c(i,!0);if(u)return u(i,!0);var a=new Error("Cannot find module '"+i+"'");throw a.code="MODULE_NOT_FOUND",a}var p=n[i]={exports:{}};e[i][0].call(p.exports,function(r){var n=e[i][1][r];return o(n||r)},p,p.exports,r,e,n,t)}return n[i].exports}for(var u="function"==typeof require&&require,i=0;i<t.length;i++)o(t[i]);return o}return r})()({1:[function(require,module,exports){
const { getCollapsedCategory, setCollapsedIds } = require('./storage.js')

View File

@@ -49,7 +49,7 @@ This file configures the logging system for the application. It defines how logs
This file contains the main application logic and API endpoints for interacting with the system. The application is built using the FastAPI framework, which provides several endpoints for creating trips, fetching trips, and retrieving landmarks or nearby facilities. The key endpoints include:
- **POST /trip/new**:
- This endpoint allows users to create a new trip by specifying preferences, start coordinates, and optionally end coordinates. The preferences guide the optimization process for selecting landmarks.
- This endpoint allows users to create a new trip by specifying user_id, preferences, start coordinates, and optionally end coordinates. The preferences guide the optimization process for selecting landmarks. The user id is needed to verify the user's credit balance.
- Returns: A `Trip` object containing the optimized route, landmarks, and trip details.
- **GET /trip/{trip_uuid}**:
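For illustration, a request against the updated endpoint now carries the user id alongside the preferences and start coordinates. A minimal sketch using the FastAPI test client (the user id is a placeholder; the preference fields mirror the test payloads elsewhere in this diff):

# Hedged sketch of the new request shape for POST /trip/new; not part of the diff itself.
from fastapi.testclient import TestClient
from src.main import app  # assumed import path for the backend app

client = TestClient(app)
response = client.post(
    "/trip/new",
    json={
        "user_id": "00000000-0000-0000-0000-000000000000",  # placeholder UUID
        "preferences": {
            "sightseeing": {"type": "sightseeing", "score": 5},
            "nature": {"type": "nature", "score": 0},
            "shopping": {"type": "shopping", "score": 0},
            "max_time_minute": 20,
            "detour_tolerance_minute": 0,
        },
        "start": [48.084588, 7.280405],
    },
)
trip = response.json()  # Trip payload with total_time, first_landmark_uuid, etc.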

View File

@@ -12,6 +12,14 @@ LANDMARK_PARAMETERS_PATH = PARAMETERS_DIR / 'landmark_parameters.yaml'
OPTIMIZER_PARAMETERS_PATH = PARAMETERS_DIR / 'optimizer_parameters.yaml'
PAYPAL_CLIENT_ID = os.getenv("future-paypal-client-id", None)
PAYPAL_SECRET = os.getenv("future-paypal-secret", None)
PAYPAL_API_URL = "https://api-m.sandbox.paypal.com"
SUPABASE_URL = os.getenv("SUPABASE_URL", None)
SUPABASE_KEY = os.getenv("SUPABASE_API_KEY", None)
cache_dir_string = os.getenv('OSM_CACHE_DIR', './cache')
OSM_CACHE_DIR = Path(cache_dir_string)

View File

@@ -3,7 +3,7 @@
import logging
import time
from contextlib import asynccontextmanager
from fastapi import FastAPI, HTTPException, BackgroundTasks, Query
from fastapi import FastAPI, HTTPException, BackgroundTasks, Query, Body
from .logging_config import configure_logging
from .structs.landmark import Landmark, Toilets
@@ -16,6 +16,9 @@ from .optimization.optimizer import Optimizer
from .optimization.refiner import Refiner
from .overpass.overpass import fill_cache
from .cache import client as cache_client
from .payments.supabase import Supabase
from .payments.payment_routes import router as payment_router
from .payments.supabase_routes import router as supabase_router
logger = logging.getLogger(__name__)
@@ -23,6 +26,7 @@ logger = logging.getLogger(__name__)
manager = LandmarkManager()
optimizer = Optimizer()
refiner = Refiner(optimizer=optimizer)
supabase = Supabase()
@asynccontextmanager
@@ -37,10 +41,16 @@ async def lifespan(app: FastAPI):
app = FastAPI(lifespan=lifespan)
# Include the payment routes and supabase routes
app.include_router(payment_router)
app.include_router(supabase_router)
@app.post("/trip/new")
def new_trip(preferences: Preferences,
start: tuple[float, float],
end: tuple[float, float] | None = None,
def new_trip(user_id: str = Body(...),
preferences: Preferences = Body(...),
start: tuple[float, float] = Body(...),
end: tuple[float, float] | None = Body(None),
background_tasks: BackgroundTasks = None) -> Trip:
"""
Main function to call the optimizer.
@@ -52,6 +62,19 @@ def new_trip(preferences: Preferences,
Returns:
(uuid) : The uuid of the first landmark in the optimized route
"""
# Check for valid user balance.
try:
has_credits = supabase.check_balance(user_id=user_id)
except SyntaxError as se :
raise HTTPException(status_code=400, detail=str(se)) from se
except ValueError as ve :
raise HTTPException(status_code=406, detail=str(ve)) from ve
except Exception as exc:
raise HTTPException(status_code=500, detail=f"Internal Server Error: {str(exc)}") from exc
if not has_credits:
logger.warning('Insufficient credits to perform this action.')
raise HTTPException(status_code=400, detail="Insufficient credits")
# Check for invalid input.
if preferences is None:
raise HTTPException(status_code=406, detail="Preferences not provided or incomplete.")
if (preferences.shopping.score == 0 and
@@ -130,13 +153,15 @@ def new_trip(preferences: Preferences,
logger.debug(f'First stage optimization\t: {round(t_first_stage,3)} seconds')
logger.debug(f'Second stage optimization\t: {round(t_second_stage,3)} seconds')
logger.info(f'Total computation time\t: {round(t_first_stage + t_second_stage,3)} seconds')
linked_tour = LinkedLandmarks(refined_tour)
# upon creation of the trip, persistence of both the trip and its landmarks is ensured.
trip = Trip.from_linked_landmarks(linked_tour, cache_client)
logger.info(f'Generated a trip of {trip.total_time} minutes with {len(refined_tour)} landmarks in {round(t_generate_landmarks + t_first_stage + t_second_stage,3)} seconds.')
logger.debug('Detailed trip :\n\t' + '\n\t'.join(f'{landmark}' for landmark in refined_tour))
background_tasks.add_task(fill_cache)
supabase.decrement_credit_balance(user_id=user_id)
return trip
@@ -178,6 +203,45 @@ def get_landmark(landmark_uuid: str) -> Landmark:
raise HTTPException(status_code=404, detail="Landmark not found") from exc
@app.post("/trip/recompute-time/{trip_uuid}/{removed_landmark_uuid}")
def update_trip_time(trip_uuid: str, removed_landmark_uuid: str) -> Trip:
"""
Updates the reaching times of a given trip when removing a landmark.
Args:
landmark_uuid (str) : unique identifier for a Landmark.
Returns:
(Landmark) : the corresponding Landmark.
"""
# First, fetch the trip in the cache.
try:
trip = cache_client.get(f'trip_{trip_uuid}')
except KeyError as exc:
raise HTTPException(status_code=404, detail='Trip not found') from exc
landmarks = []
next_uuid = trip.first_landmark_uuid
# Extract landmarks
try :
while next_uuid is not None:
landmark = cache_client.get(f'landmark_{next_uuid}')
# Filter out the removed landmark.
if next_uuid != removed_landmark_uuid :
landmarks.append(landmark)
next_uuid = landmark.next_uuid # Prepare for the next iteration
except KeyError as exc:
raise HTTPException(status_code=404, detail=f'landmark {next_uuid} not found') from exc
# Re-link everything and compute times again
linked_tour = LinkedLandmarks(landmarks)
trip = Trip.from_linked_landmarks(linked_tour, cache_client)
return trip
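As an illustration, the new recompute endpoint can be exercised with the same test client as in the earlier sketch; the identifiers below are placeholders for values obtained from a previous /trip/new response (a sketch, not part of the diff):

# Hedged usage sketch for POST /trip/recompute-time/{trip_uuid}/{removed_landmark_uuid}.
trip_uuid = "example-trip-uuid"           # placeholder: uuid of a previously generated trip
removed_uuid = "example-landmark-uuid"    # placeholder: landmark to drop from that trip
response = client.post(f"/trip/recompute-time/{trip_uuid}/{removed_uuid}")
updated_trip = response.json()            # same Trip shape, reaching times recomputed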
@app.post("/toilets/new")
def get_toilets(location: tuple[float, float] = Query(...), radius: int = 500) -> list[Toilets] :
"""
@@ -204,3 +268,6 @@ def get_toilets(location: tuple[float, float] = Query(...), radius: int = 500) -
return toilets_list
except KeyError as exc:
raise HTTPException(status_code=404, detail="No toilets found") from exc

View File

@@ -594,7 +594,7 @@ class Optimizer:
status = pl.LpStatus[prob.status]
solution = [pl.value(var) for var in x] # The values of the decision variables (will be 0 or 1)
self.logger.debug("First results are out. Looking out for circles and correcting.")
self.logger.debug("First results are out. Looking out for circles and correcting...")
# Raise error if no solution is found. FIXME: for now this throws the internal server error
if status != 'Optimal' :

View File

@@ -52,7 +52,7 @@ class Overpass :
# Retrieve cached data and identify missing cache entries
cached_responses, non_cached_cells = self._retrieve_cached_data(overlapping_cells, osm_types, selector, conditions, out)
self.logger.info(f'Cache hit for {len(overlapping_cells)-len(non_cached_cells)}/{len(overlapping_cells)} quadrants.')
self.logger.debug(f'Cache hit for {len(overlapping_cells)-len(non_cached_cells)}/{len(overlapping_cells)} quadrants.')
# If there is no missing data, return the cached responses after filtering.
if not non_cached_cells :
@@ -61,6 +61,7 @@ class Overpass :
# If there is no cached data, fetch all from Overpass.
elif not cached_responses :
query_str = Overpass.build_query(bbox, osm_types, selector, conditions, out)
self.logger.debug(f'Query string: {query_str}')
return self.fetch_data_from_api(query_str)
# Hybrid cache: some data from Overpass, some data from cache.
@@ -68,6 +69,7 @@ class Overpass :
# Resize the bbox for smaller search area and build new query string.
non_cached_bbox = Overpass._get_non_cached_bbox(non_cached_cells, bbox)
query_str = Overpass.build_query(non_cached_bbox, osm_types, selector, conditions, out)
self.logger.debug(f'Query string: {query_str}')
non_cached_responses = self.fetch_data_from_api(query_str)
return Overpass._filter_landmarks(cached_responses, bbox) + non_cached_responses

View File

@@ -1,12 +1,11 @@
max_bbox_side: 4000 #m
radius_close_to: 50
church_coeff: 0.55
nature_coeff: 1.4
church_coeff: 0.75
nature_coeff: 1.6
overall_coeff: 10
tag_exponent: 1.15
image_bonus: 1.1
viewpoint_bonus: 5
viewpoint_bonus: 10
wikipedia_bonus: 1.25
name_bonus: 3
N_important: 60
pay_bonus: -1

View File

@@ -5,5 +5,5 @@ max_landmarks: 10
max_landmarks_refiner: 20
overshoot: 0.0016
time_limit: 1
gap_rel: 0.05
gap_rel: 0.025
max_iter: 40

View File

View File

@@ -0,0 +1,70 @@
from typing import Literal
import paypalrestsdk
from pydantic import BaseModel
from fastapi import HTTPException
import logging
# Model for payment request body
class PaymentRequest(BaseModel):
user_id: str
credit_amount: Literal[10, 50, 100]
currency: Literal["USD", "EUR", "CHF"]
description: str = "Purchase of credits"
# Payment handler class for managing PayPal payments
class PaymentHandler:
payment_id: str
def __init__(self, transaction_details: PaymentRequest):
self.details = transaction_details
self.logger = logging.getLogger(__name__)
# Only support purchase of credit 'bundles': 10, 50 or 100 credits worth of trip generation
def fetch_price(self) -> float:
"""
Fetches the price of the requested credit bundle in the specified currency.
"""
# Note: assumes a Supabase client has been attached as self.supabase by the caller,
# since __init__ does not set one.
result = self.supabase.table("prices").select("*").eq("currency", self.details.currency).single().execute()
if result.data:
return result.data.get("price")
else:
self.logger.error(f"Unsupported currency: {self.details.currency}")
return None
def create_paypal_payment(self) -> str:
"""
Creates a PayPal payment and returns the approval URL.
"""
price = self.fetch_price()
payment = paypalrestsdk.Payment({
"intent": "sale",
"payer": {
"payment_method": "paypal"
},
"transactions": [{
"amount": {
"total": f"{price:.2f}",
"currency": self.details.currency
},
"description": self.details.description
}],
"redirect_urls": {
"return_url": "http://localhost:8000/payment/success",
"cancel_url": "http://localhost:8000/payment/cancel"
}
})
if payment.create():
self.logger.info("Payment created successfully")
self.payment_id = payment.id
# Get the approval URL and return it for the user to approve
for link in payment.links:
if link.rel == "approval_url":
return link.href
else:
self.logger.error(f"Failed to create payment: {payment.error}")
raise HTTPException(status_code=500, detail="Payment creation failed")

View File

@@ -0,0 +1,79 @@
import logging
import paypalrestsdk
from fastapi import HTTPException, APIRouter
from .payment_handler import PaymentRequest, PaymentHandler
from .supabase import Supabase
# Set up logging and supabase
logger = logging.getLogger(__name__)
supabase = Supabase()
# Configure PayPal SDK
paypalrestsdk.configure({
"mode": "sandbox", # Use 'live' for production
"client_id": "YOUR_PAYPAL_CLIENT_ID",
"client_secret": "YOUR_PAYPAL_SECRET"
})
# Define the API router
router = APIRouter()
@router.post("/purchase/credits")
def purchase_credits(payment_request: PaymentRequest):
"""
Handles credit purchases. Looks up the price of the requested credit bundle,
creates a PayPal payment and returns the approval URL; the user's balance is only
updated once the payment is executed via /payment/success.
"""
payment_handler = PaymentHandler(payment_request)
# Create PayPal payment and get the approval URL
approval_url = payment_handler.create_paypal_payment()
return {
"message": "Purchase initiated successfully",
"payment_id": payment_handler.payment_id,
"credits": payment_request.credit_amount,
"approval_url": approval_url,
}
@router.get("/payment/success")
def payment_success(paymentId: str, PayerID: str):
"""
Handles successful PayPal payment.
"""
payment = paypalrestsdk.Payment.find(paymentId)
if payment.execute({"payer_id": PayerID}):
logger.info("Payment executed successfully")
# Retrieve transaction details from the database
result = supabase.supabase.table("pending_payments").select("*").eq("payment_id", paymentId).single().execute()
if not result.data:
raise HTTPException(status_code=404, detail="Transaction not found")
# Extract the necessary information
user_id = result.data["user_id"]
credit_amount = result.data["credit_amount"]
# Update the user's balance
supabase.increment_credit_balance(user_id, amount=credit_amount)
# Optionally, delete the pending payment entry since the transaction is completed
supabase.table("pending_payments").delete().eq("payment_id", paymentId).execute()
return {"message": "Payment completed successfully"}
else:
logger.error(f"Payment execution failed: {payment.error}")
raise HTTPException(status_code=500, detail="Payment execution failed")
@router.get("/payment/cancel")
def payment_cancel():
"""
Handles PayPal payment cancellation.
"""
return {"message": "Payment was cancelled"}

View File

@@ -0,0 +1,170 @@
import os
import logging
import yaml
from fastapi import HTTPException, status
from supabase import create_client, Client, ClientOptions
from ..constants import PARAMETERS_DIR
# Silence the supabase logger
logging.getLogger("httpx").setLevel(logging.CRITICAL)
logging.getLogger("hpack").setLevel(logging.CRITICAL)
logging.getLogger("httpcore").setLevel(logging.CRITICAL)
class Supabase:
logger = logging.getLogger(__name__)
def __init__(self):
with open(os.path.join(PARAMETERS_DIR, 'secrets.yaml')) as f:
secrets = yaml.safe_load(f)
self.SUPABASE_URL = secrets['SUPABASE_URL']
self.SUPABASE_ADMIN_KEY = secrets['SUPABASE_ADMIN_KEY']
self.SUPABASE_TEST_USER_ID = secrets['SUPABASE_TEST_USER_ID']
self.supabase = create_client(
self.SUPABASE_URL,
self.SUPABASE_ADMIN_KEY,
options=ClientOptions(schema='public')
)
self.logger.debug('Supabase client initialized.')
def check_balance(self, user_id: str) -> bool:
"""
Checks if the user has enough 'credit' for generating a new trip.
Args:
user_id (str): The ID of the current user.
Returns:
bool: True if the balance is positive, False otherwise.
"""
try:
# Query the public.credits table to get the user's credits
response = (
self.supabase.table("credits")
.select('*')
.eq('id', user_id)
.single()
.execute()
)
# self.logger.critical(response)
except Exception as e:
if e.code == '22P02' :
self.logger.error(f"Failed querying credits : {str(e)}")
raise SyntaxError(f"Failed querying credits : {str(e)}") from e
if e.code == 'PGRST116' :
self.logger.error(f"User not found : {str(e)}")
raise ValueError(f"User not found : {str(e)}") from e
else :
self.logger.error(f"An unexpected error occured while checking user balance : {str(e)}")
raise Exception(f"An unexpected error occured while checking user balance : {str(e)}") from e
# Proceed to check the user's credit balance
credits = response.data['credit_amount']
self.logger.debug(f'Credits of user {user_id}: {credits}')
if credits > 0:
self.logger.info(f'Credit balance is positive for user {user_id}. Proceeding with trip generation.')
return True
self.logger.warning(f'Insufficient balance for user {user_id}. Trip generation cannot proceed.')
return False
def decrement_credit_balance(self, user_id: str, amount: int=1) -> bool:
"""
Decrements the user's credit balance by the given amount (default 1).
Args:
user_id (str): The ID of the current user.
amount (int): Number of credits to deduct.
"""
try:
# Query the public.credits table to get the user's current credits
response = (
self.supabase.table("credits")
.select('*')
.eq('id', user_id)
.single()
.execute()
)
except Exception as e:
if e.code == '22P02' :
self.logger.error(f"Failed decrementing credits : {str(e)}")
raise SyntaxError(f"Failed decrementing credits : {str(e)}") from e
if e.code == 'PGRST116' :
self.logger.error(f"User not found : {str(e)}")
raise ValueError(f"User not found : {str(e)}") from e
else :
self.logger.error(f"An unexpected error occured while decrementing user balance : {str(e)}")
raise Exception(f"An unexpected error occured while decrementing user balance : {str(e)}") from e
current_credits = response.data['credit_amount']
updated_credits = current_credits - amount
# Update the user's credits in the table
update_response = (
self.supabase.table('credits')
.update({'credit_amount': updated_credits})
.eq('id', user_id)
.execute()
)
# Check if the update was successful
if update_response.data:
self.logger.debug(f'Credit balance successfully decremented.')
return True
else:
raise Exception("Error decrementing credit balance.")
def increment_credit_balance(self, user_id: str, amount: int=1) -> bool:
"""
Increments the user's credit balance by the given amount (default 1).
Args:
user_id (str): The ID of the current user.
amount (int): Number of credits to add.
"""
try:
# Query the public.credits table to get the user's current credits
response = (
self.supabase.table("credits")
.select('*')
.eq('id', user_id)
.single()
.execute()
)
except Exception as e:
if e.code == '22P02' :
self.logger.error(f"Failed incrementing credits : {str(e)}")
raise SyntaxError(f"Failed incrementing credits : {str(e)}") from e
if e.code == 'PGRST116' :
self.logger.error(f"User not found : {str(e)}")
raise ValueError(f"User not found : {str(e)}") from e
else :
self.logger.error(f"An unexpected error occured while incrementing user balance : {str(e)}")
raise Exception(f"An unexpected error occured while incrementing user balance : {str(e)}") from e
current_credits = response.data['credit_amount']
updated_credits = current_credits + amount
# Update the user's credits in the table
update_response = (
self.supabase.table('credits')
.update({'credit_amount': updated_credits})
.eq('id', user_id)
.execute()
)
# Check if the update was successful
if update_response.data:
self.logger.debug(f'Credit balance successfully incremented.')
return True
else:
raise Exception("Error incrementing credit balance.")

View File

@@ -0,0 +1,52 @@
"""Endpoints for supabase user handling."""
import logging
from fastapi import APIRouter, HTTPException
from .supabase import Supabase
# Set up logging and supabase.
logger = logging.getLogger(__name__)
supabase = Supabase()
# Create fastapi router
router = APIRouter()
@router.post("/user/create/{email}/{password}")
def register_user(email: str, password: str) -> str:
try:
response = supabase.supabase.auth.admin.create_user({
"email": email,
"password": password
})
except Exception as e:
if e.code == 'email_exists' :
logger.error(f"Failed to create user : {str(e.code)}")
raise HTTPException(status_code=422, detail=str(e)) from e
logger.error(f"Failed to create user : {str(e.code)}")
raise HTTPException(status_code=500, detail=str(e)) from e
# Extract the identity_id and user_id
user_id = response.user.id
logger.info(f"User created successfully, ID: {user_id}")
return user_id
@router.post("/user/delete/{user_id}")
def delete_user(user_id: str):
try:
response = supabase.supabase.auth.admin.delete_user(user_id)
logger.debug(response)
except Exception as e:
if e.code == 'user_not_found' :
logger.error(f"Failed to delete user : {str(e.code)}")
raise HTTPException(status_code=404, detail=str(e)) from e
logger.error(f"Failed to create user : {str(e.code)}")
raise HTTPException(status_code=500, detail=str(e)) from e
logger.info(f"User with ID {user_id} deleted successfully")

View File

@@ -4,6 +4,7 @@ from fastapi.testclient import TestClient
import pytest
from ..main import app
from ..constants import SUPABASE_TEST_USER_ID
@pytest.fixture(scope="module")
@@ -55,8 +56,38 @@ def test_input(invalid_client, start, preferences, status_code): # pylint: dis
response = invalid_client.post(
"/trip/new",
json={
"user_id": SUPABASE_TEST_USER_ID,
"preferences": preferences,
"start": start
}
)
assert response.status_code == status_code
@pytest.mark.parametrize(
"user_id,status_code",
[
# No user id :
({}, 422),
("invalid_user_id", 400),
# ("12345678-1234-5678-1234-567812345678", 406)
]
)
def test_invalid_user(invalid_client, user_id, status_code): # pylint: disable=redefined-outer-name
"""
Test new trip creation with invalid user ID.
"""
response = invalid_client.post(
"/trip/new",
json={
"user_id": user_id,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 0},
"shopping": {"type": "shopping", "score": 0},
"max_time_minute": 20,
"detour_tolerance_minute": 0},
"start": [48.084588, 7.280405]
}
)
assert response.status_code == status_code

View File

@@ -1,10 +1,17 @@
"""Collection of tests to ensure correct implementation and track progress. """
import time
import logging
from fastapi.testclient import TestClient
import pytest
from .test_utils import load_trip_landmarks, log_trip_details
from ..main import app
from ..payments.supabase import Supabase
supabase = Supabase()
logger = logging.getLogger(__name__)
USER_ID = supabase.SUPABASE_TEST_USER_ID
@pytest.fixture(scope="module")
def client():
@@ -22,21 +29,24 @@ def test_turckheim(client, request): # pylint: disable=redefined-outer-name
"""
start_time = time.time() # Start timer
duration_minutes = 20
logger.debug('Running test in Turckheim')
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 0},
"shopping": {"type": "shopping", "score": 0},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
# "start": [48.084588, 7.280405]
"nature": {"type": "nature", "score": 0},
"shopping": {"type": "shopping", "score": 0},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [48.084588, 7.280405]
# "start": [45.74445023349939, 4.8222687890538865]
"start": [45.75156398104873, 4.827154464827647]
# "start": [45.75156398104873, 4.827154464827647]
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
@@ -58,6 +68,8 @@ def test_turckheim(client, request): # pylint: disable=redefined-outer-name
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
# assert 2!= 3
def test_bellecour(client, request) : # pylint: disable=redefined-outer-name
"""
Test n°2 : Custom test in Lyon centre to ensure proper decision making in crowded area.
@@ -73,6 +85,7 @@ def test_bellecour(client, request) : # pylint: disable=redefined-outer-name
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
@@ -82,6 +95,7 @@ def test_bellecour(client, request) : # pylint: disable=redefined-outer-name
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
@@ -113,6 +127,7 @@ def test_cologne(client, request) : # pylint: disable=redefined-outer-name
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
@@ -122,6 +137,7 @@ def test_cologne(client, request) : # pylint: disable=redefined-outer-name
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
@@ -154,6 +170,7 @@ def test_strasbourg(client, request) : # pylint: disable=redefined-outer-name
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
@@ -163,6 +180,7 @@ def test_strasbourg(client, request) : # pylint: disable=redefined-outer-name
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
@@ -195,6 +213,7 @@ def test_zurich(client, request) : # pylint: disable=redefined-outer-name
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
@@ -204,6 +223,7 @@ def test_zurich(client, request) : # pylint: disable=redefined-outer-name
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
@@ -236,6 +256,7 @@ def test_paris(client, request) : # pylint: disable=redefined-outer-name
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 0},
"shopping": {"type": "shopping", "score": 5},
@@ -245,6 +266,7 @@ def test_paris(client, request) : # pylint: disable=redefined-outer-name
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
@@ -277,6 +299,7 @@ def test_new_york(client, request) : # pylint: disable=redefined-outer-name
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
@@ -286,6 +309,7 @@ def test_new_york(client, request) : # pylint: disable=redefined-outer-name
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
@@ -318,6 +342,7 @@ def test_shopping(client, request) : # pylint: disable=redefined-outer-name
response = client.post(
"/trip/new",
json={
"user_id": USER_ID,
"preferences": {"sightseeing": {"type": "sightseeing", "score": 0},
"nature": {"type": "nature", "score": 0},
"shopping": {"type": "shopping", "score": 5},
@@ -327,6 +352,7 @@ def test_shopping(client, request) : # pylint: disable=redefined-outer-name
}
)
result = response.json()
supabase.increment_credit_balance(user_id=USER_ID)
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
@@ -342,4 +368,4 @@ def test_shopping(client, request) : # pylint: disable=redefined-outer-name
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"

View File

@@ -0,0 +1,48 @@
"""Collection of tests to ensure correct handling of user data."""
from fastapi.testclient import TestClient
import pytest
from ..main import app
TEST_EMAIL = "dummy@example.com"
TEST_PW = "DummyPassword123"
@pytest.fixture(scope="module")
def client():
"""Client used to call the app."""
return TestClient(app)
def test_user_handling(client) :
"""
Test the creation of a new user.
"""
# Create a new user
response = client.post(f"/user/create/{TEST_EMAIL}/{TEST_PW}")
# Verify user has been created
assert response.status_code == 200, "Failed to create dummy user"
user_id = response.json()
# Create same user again to raise an error
response = client.post(f"/user/create/{TEST_EMAIL}/{TEST_PW}")
# Verify user already exists
assert response.status_code == 422, "Failed to simulate dummy user already created."
# Delete the user.
response = client.post(f"/user/delete/{user_id}")
# Verify user has been deleted
assert response.status_code == 200, "Failed to delete dummy user."
# Delete the user again to raise an error
response = client.post(f"/user/delete/{user_id}")
# Verify user has been deleted
assert response.status_code == 404, "Failed to simulate dummy user already deleted."

View File

@@ -1,6 +1,6 @@
"""Find clusters of interest to add more general areas of visit to the tour."""
import logging
from typing import Literal
from typing import Literal, Tuple
import numpy as np
from sklearn.cluster import DBSCAN
@@ -33,7 +33,7 @@ class Cluster(BaseModel):
"""
type: Literal['street', 'area']
importance: int
centroid: tuple
centroid: Tuple[float, float]
# start: Optional[list] = None # for later use if we want to have streets as well
# end: Optional[list] = None
@@ -134,7 +134,7 @@ class ClusterManager:
# Check that there is at least 1 cluster
if len(set(labels)) > 1 :
self.logger.info(f"Found {len(set(labels))} different {cluster_type} clusters.")
self.logger.info(f"Found {len(set(labels))} {cluster_type} clusters.")
# Separate clustered points and noise points
self.cluster_points = self.all_points[labels != -1]
self.cluster_labels = labels[labels != -1]
@@ -178,11 +178,12 @@ class ClusterManager:
# Calculate the centroid as the mean of the points
centroid = np.mean(current_cluster, axis=0)
centroid = tuple((round(centroid[0], 7), round(centroid[1], 7)))
if self.cluster_type == 'shopping' :
score = len(current_cluster)*2
score = len(current_cluster)*3
else :
score = len(current_cluster)*8
score = len(current_cluster)*15
locations.append(Cluster(
type='area',
centroid=centroid,
@@ -215,7 +216,7 @@ class ClusterManager:
"""
# Define the bounding box for a given radius around the coordinates
bbox = create_bbox(cluster.centroid, 1000)
bbox = create_bbox(cluster.centroid, 300)
# Query neighborhoods and shopping malls
selectors = ['"place"~"^(suburb|neighborhood|neighbourhood|quarter|city_block)$"']
@@ -223,10 +224,10 @@ class ClusterManager:
if self.cluster_type == 'shopping' :
selectors.append('"shop"="mall"')
new_name = 'Shopping Area'
t = 40
t = 30
else :
new_name = 'Neighborhood'
t = 15
t = 20
min_dist = float('inf')
osm_id = 0
@@ -238,7 +239,7 @@ class ClusterManager:
result = self.overpass.send_query(bbox = bbox,
osm_types = osm_types,
selector = sel,
out = 'ids center'
out = 'ids center tags'
)
except Exception as e:
self.logger.error(f"Error fetching clusters: {e}")
@@ -259,9 +260,9 @@ class ClusterManager:
d = get_distance(cluster.centroid, coords)
if d < min_dist :
min_dist = d
new_name = name
osm_type = osm_type # Add type: 'way' or 'relation'
osm_id = id # Add OSM id
new_name = name # add name
osm_type = osm_type # add type: 'way' or 'relation'
osm_id = id # add OSM id
return Landmark(
name=new_name,

View File

@@ -39,7 +39,6 @@ class LandmarkManager:
self.overall_coeff = parameters['overall_coeff']
self.tag_exponent = parameters['tag_exponent']
self.image_bonus = parameters['image_bonus']
self.name_bonus = parameters['name_bonus']
self.wikipedia_bonus = parameters['wikipedia_bonus']
self.viewpoint_bonus = parameters['viewpoint_bonus']
self.pay_bonus = parameters['pay_bonus']
@@ -147,6 +146,8 @@ class LandmarkManager:
score *= self.wikipedia_bonus
if landmark.is_place_of_worship :
score *= self.church_coeff
if landmark.is_viewpoint :
score *= self.viewpoint_bonus
if landmarktype == 'nature' :
score *= self.nature_coeff
@@ -201,7 +202,7 @@ class LandmarkManager:
return_list += self._to_landmarks(result, landmarktype, preference_level)
self.logger.debug(f"Fetched {len(return_list)} landmarks of type {landmarktype} in {bbox}")
# self.logger.debug(f"Fetched {len(return_list)} landmarks of type {landmarktype} in {bbox}")
return return_list
@@ -267,7 +268,7 @@ class LandmarkManager:
landmark.image_url = value
if key == 'website' :
landmark.website_url = value
if key == 'place_of_worship' :
if value == 'place_of_worship' :
landmark.is_place_of_worship = True
if key == 'wikipedia' :
landmark.wiki_url = value
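For a rough sense of what the retuned coefficients do, the bonuses visible in this diff compose multiplicatively. A sketch with the new parameter values (the base score and the rest of the scoring pipeline are not shown in the diff, so the numbers are purely illustrative):

# Illustrative only: multiplying out the bonuses shown above with the new values
# from landmark_parameters.yaml (wikipedia_bonus=1.25, church_coeff=0.75,
# viewpoint_bonus=10, nature_coeff=1.6). The base score of 100 is an arbitrary placeholder.
score = 100.0
has_wiki, is_place_of_worship, is_viewpoint, is_nature = True, False, True, False

if has_wiki:
    score *= 1.25   # wikipedia_bonus
if is_place_of_worship:
    score *= 0.75   # church_coeff
if is_viewpoint:
    score *= 10     # viewpoint_bonus (raised from 5)
if is_nature:
    score *= 1.6    # nature_coeff (raised from 1.4)

print(score)  # 1250.0 for a wiki-linked viewpoint under these assumptions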

View File

@@ -33,6 +33,7 @@ fetchTrip(
UserPreferences preferences,
) async {
Map<String, dynamic> data = {
// Add user ID here for API request
"preferences": preferences.toJson(),
"start": trip.landmarks!.first.location,
};

1091
report.html Normal file

File diff suppressed because it is too large.