273 Commits

Author SHA1 Message Date
431ae7c670 Merge pull request 'fix loki' (#54) from backend/fix-loki-dependencies into main
Some checks failed
Build and deploy the backend to production / Build and push image (push) Successful in 1m37s
/ push-to-remote (push) Failing after 46s
Build and deploy the backend to production / Deploy to production (push) Successful in 21s
Reviewed-on: #54
2025-01-24 15:45:33 +00:00
e612a82921 fixed loki + some opti changes
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m36s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 4m46s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 22s
2025-01-24 16:03:05 +01:00
163e10032c removed click
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m20s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 3m31s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 22s
2025-01-24 15:01:16 +01:00
06c01837cf ???
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m37s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 3m3s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 22s
2025-01-24 09:12:45 +01:00
cd24ee4a67 nothing changed
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m31s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 2m28s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 23s
2025-01-24 08:54:42 +01:00
85c69d5e01 installed requests pkg
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m37s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 2m59s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 22s
2025-01-24 06:42:08 +01:00
d02ba85c31 Merge pull request 'backend/better-README' (#53) from backend/better-README into main
Some checks failed
Build and deploy the backend to production / Build and push image (push) Successful in 2m19s
/ push-to-remote (push) Failing after 42s
Build and deploy the backend to production / Deploy to production (push) Successful in 23s
Reviewed-on: #53
2025-01-23 17:11:14 +00:00
0c9b829c3f more stuff
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m2s
Run linting on the backend code / Build (pull_request) Successful in 29s
Run testing on the backend code / Build (pull_request) Failing after 4m29s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 25s
2025-01-23 18:08:20 +01:00
b9d45ac9f1 better src readme 2025-01-23 18:02:57 +01:00
2f86536893 more readme 2025-01-23 17:57:17 +01:00
8d9e2d9207 Merge pull request 'backend/new-overpass' (#52) from backend/new-overpass into main
Reviewed-on: #52
2025-01-23 15:34:21 +00:00
259b0d36fd corrected mistakes
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m48s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 4m30s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 22s
2025-01-23 16:22:41 +01:00
577ee232fc overpass as class
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m4s
Run linting on the backend code / Build (pull_request) Successful in 29s
Run testing on the backend code / Build (pull_request) Failing after 4m39s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-23 16:02:33 +01:00
1cc935fb34 revert json cache
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m51s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 5m41s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 22s
2025-01-23 15:33:21 +01:00
4818bde820 cleanup before prod
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m57s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 53s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-23 12:53:01 +01:00
b30fa1f02e ready for prod
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m25s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 4m5s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-23 12:37:47 +01:00
150055c1b2 better tests, ready for prod actually
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m31s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 3m58s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 23s
2025-01-23 12:28:10 +01:00
f863c41653 excellent results
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m22s
Run linting on the backend code / Build (pull_request) Successful in 49s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 23s
Run testing on the backend code / Build (pull_request) Failing after 5m48s
2025-01-23 12:17:34 +01:00
f67e2b5dd6 better docstrings
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m26s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 1m27s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 52s
2025-01-23 12:13:51 +01:00
28ff0460ab cleanup
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m16s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 1m14s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-23 12:01:49 +01:00
b9356dc4ee linting
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m51s
Run linting on the backend code / Build (pull_request) Successful in 37s
Run testing on the backend code / Build (pull_request) Failing after 3m8s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-23 11:28:41 +01:00
78f1dcaab4 fixed up clusters
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m28s
Run linting on the backend code / Build (pull_request) Successful in 26s
Run testing on the backend code / Build (pull_request) Failing after 1m51s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-23 10:33:32 +01:00
ca40de82dd working cache
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m19s
Run linting on the backend code / Build (pull_request) Successful in 25s
Run testing on the backend code / Build (pull_request) Failing after 7m37s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-23 08:04:26 +01:00
c668158341 first homemade OSM
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m50s
Run linting on the backend code / Build (pull_request) Successful in 26s
Run testing on the backend code / Build (pull_request) Failing after 1m44s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-22 20:21:00 +01:00
98576cff0a forgot tests
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m9s
Run linting on the backend code / Build (pull_request) Successful in 26s
Run testing on the backend code / Build (pull_request) Failing after 10m28s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 13s
2025-01-16 14:56:32 +01:00
7027444602 linting
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m22s
Run linting on the backend code / Build (pull_request) Successful in 43s
Run testing on the backend code / Build (pull_request) Failing after 2m23s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-16 12:22:36 +01:00
e5a4645f7a faster pulp and more tests
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m14s
Run linting on the backend code / Build (pull_request) Successful in 26s
Run testing on the backend code / Build (pull_request) Failing after 7m45s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 24s
2025-01-16 10:15:24 +01:00
e2e54f5205 finally pulp is working!
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m31s
Run linting on the backend code / Build (pull_request) Successful in 25s
Run testing on the backend code / Build (pull_request) Failing after 5m30s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 23s
2025-01-16 07:34:55 +01:00
2be7cd1e61 some problem 2025-01-15 21:12:47 +01:00
3ebe0b7191 good starting point, working pulp 2025-01-15 19:55:48 +01:00
814da4b5f6 working pulp 2025-01-15 17:11:29 +01:00
3fe6056f3c first steps toward pulp 2025-01-15 16:48:32 +01:00
d62dddd424 forgot assertion
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m3s
Run linting on the backend code / Build (pull_request) Successful in 26s
Run testing on the backend code / Build (pull_request) Failing after 1m11s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 28s
2025-01-15 10:05:42 +01:00
133f81ce3b more print
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m23s
Run linting on the backend code / Build (pull_request) Successful in 25s
Run testing on the backend code / Build (pull_request) Successful in 1m18s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 23s
2025-01-15 10:05:12 +01:00
14385342cc better sym breaking
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m7s
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 1m11s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 23s
2025-01-15 09:35:25 +01:00
dba988629d formatting for tests
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m23s
Run linting on the backend code / Build (pull_request) Successful in 26s
Run testing on the backend code / Build (pull_request) Failing after 1m13s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 23s
2025-01-15 08:58:17 +01:00
ecd505a9ce massive numpy optimization and more tests
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m17s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Successful in 25s
Run testing on the backend code / Build (pull_request) Failing after 6m58s
2025-01-14 18:23:58 +01:00
4fae658dbb better array handling in the optimizer
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m39s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Successful in 25s
Run testing on the backend code / Build (pull_request) Failing after 1m37s
2025-01-14 11:59:23 +01:00
41976e3e85 corrected timing
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m18s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Successful in 25s
Run testing on the backend code / Build (pull_request) Failing after 2m55s
2025-01-10 16:07:10 +01:00
73373e0fc3 linting
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m9s
Run linting on the backend code / Build (pull_request) Successful in 34s
Run testing on the backend code / Build (pull_request) Failing after 1m56s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 17s
2025-01-10 15:59:44 +01:00
c6cebd0fdf speeding up optimizer
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m30s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 25s
Run testing on the backend code / Build (pull_request) Failing after 2m0s
2025-01-10 15:46:10 +01:00
11bbf34375 better logs
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m20s
Run linting on the backend code / Build (pull_request) Failing after 26s
Run testing on the backend code / Build (pull_request) Failing after 2m40s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 17s
2025-01-09 16:58:38 +01:00
a0a3d76b78 cleaning up
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m17s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 25s
Run testing on the backend code / Build (pull_request) Failing after 1m54s
2025-01-09 09:42:11 +01:00
160059d94b Merge pull request 'some more fixes' (#49) from fix/frontend/more-pipeline-fixes into main
Reviewed-on: #49
2025-01-08 09:44:54 +00:00
18d59012cb Merge pull request 'use additional loki logger' (#48) from feature/backend/centralized-logging into main
Reviewed-on: #48
2025-01-08 09:43:36 +00:00
f297094c1a Merge pull request 'better naming and MM' (#45) from backend/fix/better-match-making into main
Reviewed-on: #45
2025-01-08 09:42:35 +00:00
86187d9069 launch adjustments
Some checks failed
Run linting on the backend code / Build (pull_request) Failing after 28s
Run testing on the backend code / Build (pull_request) Failing after 1m28s
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m35s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 17s
2024-12-29 15:06:23 +01:00
4e07c10969 actually use fastapi lifetime manager to setup logging
Some checks failed
Run testing on the backend code / Build (pull_request) Has been cancelled
Run linting on the backend code / Build (pull_request) Has been cancelled
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
2024-12-29 14:45:41 +01:00
bc63b57154 dumb type conversion
Some checks failed
Run linting on the backend code / Build (pull_request) Has been cancelled
Run testing on the backend code / Build (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m14s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 38s
2024-12-28 22:34:14 +01:00
fa083a1080 logging cleanup
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m9s
Run linting on the backend code / Build (pull_request) Failing after 30s
Run testing on the backend code / Build (pull_request) Failing after 1m41s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 18s
2024-12-28 22:25:42 +01:00
c448e2dfb7 more verbose logger setup
Some checks failed
Run linting on the backend code / Build (pull_request) Has been cancelled
Run testing on the backend code / Build (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m32s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 16s
2024-12-28 15:52:29 +01:00
d9061388dd use additional loki logger
Some checks failed
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 1m27s
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 20s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
2024-12-28 14:01:21 +01:00
a9851f9627 some more fixes
Some checks failed
Build and release debug APK / Build APK (pull_request) Has been cancelled
2024-12-28 12:39:14 +01:00
e764393706 Some more pipeline-fixes (#46)
Some checks failed
/ push-to-remote (push) Successful in 13s
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Has been cancelled
Reviewed-on: #46
2024-12-21 15:54:04 +00:00
a0467e1e19 higher importance for historic clusters and first time no failed test
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m52s
Run linting on the backend code / Build (pull_request) Successful in 29s
Run testing on the backend code / Build (pull_request) Successful in 2m7s
Build and release debug APK / Build APK (pull_request) Successful in 7m28s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 17s
2024-12-16 18:09:33 +01:00
9b61471c94 better naming and MM
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m22s
Run linting on the backend code / Build (pull_request) Successful in 32s
Run testing on the backend code / Build (pull_request) Failing after 2m15s
Build and release debug APK / Build APK (pull_request) Successful in 7m32s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 17s
2024-12-16 17:56:53 +01:00
a59029c809 Merge pull request 'don't use vault anymore' (#43) from frontend/ci-cd-fixes into main
All checks were successful
/ push-to-remote (push) Successful in 13s
Build and deploy the backend to production / Build and push image (push) Successful in 2m33s
Build and deploy the backend to production / Deploy to production (push) Successful in 16s
Reviewed-on: #43
2024-12-15 11:24:23 +00:00
9e0864d300 don't use vault anymore
Some checks failed
Build and release debug APK / Build APK (pull_request) Failing after 3m55s
2024-12-15 12:24:00 +01:00
3dc27b2382 Merge pull request 'Build pipeline for both platforms' (#42) from feature/frontend/ios-builds into main
Reviewed-on: #42
2024-12-14 20:49:04 +00:00
9326cf8a74 final fixes for an initial test
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m37s
Build and release debug APK / Build APK (pull_request) Failing after 3m34s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 56s
2024-12-14 21:47:41 +01:00
97cb5b16aa some more fastlane fixes
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m52s
Build and release debug APK / Build APK (pull_request) Failing after 3m50s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 16s
2024-12-14 19:54:28 +01:00
a4a70d56c6 fastlane fixes
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m23s
Build and release debug APK / Build APK (pull_request) Failing after 3m17s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-12-14 19:48:08 +01:00
7acfb84122 keep using match
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m42s
Build and release debug APK / Build APK (pull_request) Failing after 3m26s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-12-14 19:16:37 +01:00
ddd2e91328 Merge pull request 'Adding features to find public toilets and shopping streets' (#41) from feature/backend/toilets-and-shopping-streets into main
Some checks failed
Build and deploy the backend to production / Build and push image (push) Failing after 43s
Build and deploy the backend to production / Deploy to production (push) Has been skipped
/ push-to-remote (push) Failing after 13s
Reviewed-on: #41
2024-12-14 15:57:08 +00:00
1e05c5d886 smarter tests
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m33s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 32s
Run testing on the backend code / Build (pull_request) Failing after 1m58s
2024-12-14 16:54:55 +01:00
edd8a8b2b9 new endpoint for toilets
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m35s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 28s
Run testing on the backend code / Build (pull_request) Failing after 1m24s
2024-12-14 16:52:07 +01:00
cbada7e4a4 move secrets to hashicorp, don't use match (wip)
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m41s
Build and release debug APK / Build APK (pull_request) Failing after 4m46s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 16s
2024-12-14 16:39:27 +01:00
2033941953 pymemcache fixture
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m56s
Run linting on the backend code / Build (pull_request) Successful in 29s
Run testing on the backend code / Build (pull_request) Failing after 1m59s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 19s
2024-12-14 13:22:41 +01:00
4a542a4a1f switch secrets to loading from env - towards a more unified way of handling secrets
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m42s
Build and release debug APK / Build APK (pull_request) Has been cancelled
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
2024-12-13 15:05:18 +01:00
291edcfc2a raise exception when not landmark in main
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m8s
Run linting on the backend code / Build (pull_request) Successful in 30s
Run testing on the backend code / Build (pull_request) Failing after 1m27s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 16s
2024-12-13 09:37:15 +01:00
b1b09ccf58 new test against cache
Some checks failed
Run testing on the backend code / Build (pull_request) Failing after 1m29s
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 41s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Successful in 27s
2024-12-13 09:01:01 +01:00
f0873ff313 more error checking
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m40s
Run linting on the backend code / Build (pull_request) Successful in 28s
Run testing on the backend code / Build (pull_request) Failing after 1m23s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-12-13 07:48:39 +01:00
f25355ee3e gearing up towards a working build pipeline
Some checks failed
Build and release debug APK / Build APK (pull_request) Has been cancelled
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
2024-12-12 19:33:30 +01:00
edf2f3af72 lint score
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m15s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Successful in 27s
Run testing on the backend code / Build (pull_request) Failing after 1m39s
2024-12-10 16:39:45 +01:00
449e2b2ce4 more error catching
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m16s
Run linting on the backend code / Build (pull_request) Failing after 1m1s
Run testing on the backend code / Build (pull_request) Failing after 1m44s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 16s
2024-12-10 16:26:53 +01:00
4b9683a929 report
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m18s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 1m54s
2024-12-10 16:15:46 +01:00
6c84bab7d7 cls to self
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m17s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 1m29s
2024-12-07 16:04:47 +01:00
a4c435c398 better pep8
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m30s
Run linting on the backend code / Build (pull_request) Failing after 28s
Run testing on the backend code / Build (pull_request) Failing after 1m37s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 16s
2024-12-04 19:23:26 +01:00
37fb0f2183 skipping test file for pylint
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Run linting on the backend code / Build (pull_request) Failing after 28s
Run testing on the backend code / Build (pull_request) Failing after 1m55s
2024-12-04 18:52:54 +01:00
e1727a5a49 fixed pylint disable
Some checks failed
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
Run testing on the backend code / Build (pull_request) Has been cancelled
Run linting on the backend code / Build (pull_request) Failing after 1m6s
2024-12-04 18:51:24 +01:00
fdb5f34e26 improved error handling
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m12s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 28s
Run testing on the backend code / Build (pull_request) Failing after 2m8s
2024-12-04 18:49:39 +01:00
7f77ecab04 cluster recognition added to backend pipeline
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 3m0s
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 2m9s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-12-04 18:34:43 +01:00
d9be7b0707 now with better names
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m31s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 28s
Run testing on the backend code / Build (pull_request) Failing after 2m24s
2024-12-04 15:43:35 +01:00
9ddfa0393f now with names
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m45s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 30s
Run testing on the backend code / Build (pull_request) Failing after 1m42s
2024-12-03 16:29:30 +01:00
ef26b882b1 simplified the bbox creation
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m8s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 1m53s
2024-12-03 15:05:27 +01:00
06d2f4c8aa working algo for clusters
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 2m17s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Run linting on the backend code / Build (pull_request) Failing after 30s
Run testing on the backend code / Build (pull_request) Failing after 1m28s
2024-12-02 22:22:35 +01:00
4397e36be6 adding sklearn
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m48s
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 1m13s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-12-01 18:15:17 +01:00
406b745592 adding data
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m8s
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 1m10s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-12-01 18:12:18 +01:00
ced4d63382 Merge pull request 'Linting src in backend' (#40) from backend/linting-src into main
Some checks failed
Build and deploy the backend to production / Build and push image (push) Successful in 1m59s
/ push-to-remote (push) Failing after 13s
Build and deploy the backend to production / Deploy to production (push) Successful in 14s
Reviewed-on: #40
2024-12-01 17:01:32 +00:00
70a93c7143 shorter lines
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m27s
Run linting on the backend code / Build (pull_request) Failing after 36s
Run testing on the backend code / Build (pull_request) Failing after 1m25s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-11-30 18:27:52 +01:00
eec3be5122 adding pylint control to disable E0401 error
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m36s
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 1m10s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-30 18:00:18 +01:00
41e2746d82 more testing and better pylint score
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m48s
Run linting on the backend code / Build (pull_request) Failing after 29s
Run testing on the backend code / Build (pull_request) Failing after 1m13s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 47s
2024-11-30 17:55:33 +01:00
4f169c483e first pylint correction
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m26s
Run linting on the backend code / Build (pull_request) Failing after 30s
Run testing on the backend code / Build (pull_request) Successful in 2m12s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-11-22 11:55:25 +01:00
02e9b13a98 Merge pull request 'auto test report' (#39) from backend/auto-test-report into main
Reviewed-on: #39
2024-11-22 10:09:15 +00:00
1b955f249e skipping boundaries
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m41s
Run linting on the backend code / Build (pull_request) Failing after 30s
Run testing on the backend code / Build (pull_request) Successful in 2m18s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-22 11:03:13 +01:00
d35ff30864 better details handling
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m31s
Run linting on the backend code / Build (pull_request) Failing after 30s
Run testing on the backend code / Build (pull_request) Failing after 1m47s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-11-21 15:36:20 +01:00
b56647f12e pylint update and better tour viz
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m41s
Run linting on the backend code / Build (pull_request) Failing after 30s
Run testing on the backend code / Build (pull_request) Failing after 1m47s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 38s
2024-11-21 15:28:12 +01:00
a718b883d7 pylint update
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m19s
Run linting on the backend code / Build (pull_request) Failing after 26s
Run testing on the backend code / Build (pull_request) Failing after 1m45s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-21 13:55:13 +01:00
29a0c715dd corrected report path
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m9s
Run linting on the backend code / Build (pull_request) Failing after 26s
Run testing on the backend code / Build (pull_request) Successful in 1m59s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-20 17:21:54 +01:00
f939b1bd6a bellecour set 60 minutes to test
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m31s
Run linting on the backend code / Build (pull_request) Failing after 26s
Run testing on the backend code / Build (pull_request) Successful in 1m59s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 17s
2024-11-20 16:49:08 +01:00
840eb40247 auto test report
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m57s
Run linting on the backend code / Build (pull_request) Failing after 25s
Run testing on the backend code / Build (pull_request) Failing after 1m11s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 20s
2024-11-20 16:47:51 +01:00
881f6a901d Merge pull request 'fix/backend/veiwpoint-nodes-and-churches' (#38) from fix/backend/veiwpoint-nodes-and-churches into main
Reviewed-on: #38
2024-11-18 16:09:19 +00:00
2810d93f98 migrated tests
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m56s
Run linting on the backend code / Build (pull_request) Failing after 25s
Run testing on the backend code / Build (pull_request) Failing after 1m9s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-11-18 16:52:01 +01:00
4305b21329 first fixes 2024-11-18 15:39:20 +01:00
e18a9c63e6 Merge pull request 'feature/backend/better_time_management' (#34) from feature/backend/better_time_management into main
Reviewed-on: #34
2024-11-06 13:36:51 +00:00
5fcadbe8d8 extended backend readme
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m8s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-05 18:05:34 +01:00
5afb646381 Update backend/README.md
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m37s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 15s
2024-11-05 14:46:06 +00:00
d0e837377b Update backend/README.md
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m12s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-05 14:45:47 +00:00
d94c69c545 somewhat better durations
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m39s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-04 19:59:52 +01:00
9e595ad933 fixed the error
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m40s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-04 17:32:38 +01:00
53d56f3e30 remove cmakelists from vscode settings
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m41s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-11-04 16:55:23 +01:00
f39d02f967 better readme setup backend
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m15s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-10-29 12:14:12 +01:00
94a7adac6c Merge pull request 'build id fixes' (#32) from fix/frontend/yet-another-fastlane-fix into main
Some checks failed
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Has been cancelled
/ push-to-remote (push) Successful in 13s
Reviewed-on: #32
2024-10-22 14:24:23 +00:00
4d99715447 build id fixes
Some checks failed
Build and release debug APK / Build APK (pull_request) Has been cancelled
2024-10-22 16:23:53 +02:00
48555e7429 Merge pull request 'adjust path' (#31) from fix/frontend/ci-adjustments into main
Some checks failed
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Has been cancelled
/ push-to-remote (push) Failing after 11s
Reviewed-on: #31
2024-10-22 13:52:32 +00:00
8b24876fd1 adjust path
Some checks failed
Build and release debug APK / Build APK (pull_request) Has been cancelled
2024-10-22 15:52:03 +02:00
c832461f29 Merge pull request 'fixes for fastlane and gitea actions' (#30) from fix/frontend/ci-adjustments into main
Reviewed-on: #30
2024-10-22 13:26:02 +00:00
6f1a019d4f fixes for fastlane and gitea actions
All checks were successful
Build and release debug APK / Build APK (pull_request) Successful in 7m36s
Build and deploy the backend to production / Build and push image (push) Successful in 1m44s
/ push-to-remote (push) Successful in 12s
Build and deploy the backend to production / Deploy to production (push) Successful in 15s
2024-10-22 15:11:38 +02:00
e6ccb7078b Merge pull request 'also upload aab' (#29) from fix/frontend/fastlane-config into main
Some checks are pending
Build and deploy the backend to production / Deploy to production (push) Blocked by required conditions
Build and deploy the backend to production / Build and push image (push) Waiting to run
/ push-to-remote (push) Successful in 12s
Reviewed-on: #29
2024-10-22 12:51:22 +00:00
84839c5a02 also upload aab
Some checks failed
Build and release APK / Build APK (pull_request) Has been cancelled
2024-10-22 14:50:59 +02:00
9850e949c3 Merge pull request 'remove unneeded screenshots' (#28) from frontend/fix/remove-screenshots into main
Some checks failed
Build and deploy the backend to production / Build and push image (push) Has been cancelled
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
/ push-to-remote (push) Successful in 13s
Reviewed-on: #28
2024-10-22 09:54:44 +00:00
5fc25a3c39 remove unneeded screenshots
Some checks failed
Build and release APK / Build APK (pull_request) Has been cancelled
2024-10-22 11:52:59 +02:00
f76cd603f3 Merge pull request 'Better landmark finding' (#27) from fix/backend/moore-tags into main
All checks were successful
Build and deploy the backend to production / Build and push image (push) Successful in 1m38s
/ push-to-remote (push) Successful in 13s
Build and deploy the backend to production / Deploy to production (push) Successful in 14s
Reviewed-on: #27
2024-10-22 09:20:53 +00:00
67f748244f update deployment config
Some checks failed
Build and release APK / Build APK (pull_request) Has been cancelled
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
2024-10-15 10:30:20 +02:00
66fa55e8c3 remove boring bridges
Some checks failed
Build and release APK / Build APK (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m40s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-10-14 19:10:31 +00:00
f4729a2de7 fix pipfile mismatch
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m8s
Build and release APK / Build APK (pull_request) Failing after 5m41s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 14s
2024-10-09 23:41:58 +02:00
40edd923c3 remove geopy dependency
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 53s
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been skipped
Build and release APK / Build APK (pull_request) Failing after 6m44s
2024-10-09 16:12:41 +02:00
6ad749eeed fix? bug where landmarks are returned as none 2024-10-09 16:10:40 +02:00
c20ebf3d63 add more tags and filter more restrictively
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m57s
Build and release APK / Build APK (pull_request) Failing after 5m40s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 19s
2024-10-01 16:10:52 +02:00
6facde6e0b Merge pull request 'UI and UX fixes to make new trips (and the errors it might have less confusing)' (#26) from fix/frontend/greeter-ui into main
Some checks failed
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Has been cancelled
/ push-to-remote (push) Successful in 16s
Reviewed-on: #26
2024-09-30 16:09:10 +00:00
55b0a1b793 make ui at least usable throughout
Some checks failed
Build and release APK / Build APK (pull_request) Has been cancelled
2024-09-30 18:07:27 +02:00
39df97f4d1 fix multiline secrets by using base64 2024-09-30 18:07:17 +02:00
43aa26a107 small kubeconfig dum dum
All checks were successful
Build and deploy the backend to production / Build and push image (push) Successful in 1m43s
/ push-to-remote (push) Successful in 12s
Build and deploy the backend to production / Deploy to production (push) Successful in 13s
2024-09-27 10:41:04 +02:00
badb8ff919 Merge pull request 'ensure attractiveness is always an int' (#25) from fix/backend/pydantic-issues into main
Reviewed-on: #25
2024-09-27 08:28:37 +00:00
290baec64e associated backend fixes
Some checks failed
Build and deploy the backend to staging / Deploy to staging (pull_request) Has been cancelled
Build and deploy the backend to staging / Build and push image (pull_request) Has been cancelled
Build and release APK / Build APK (pull_request) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Successful in 1m39s
/ push-to-remote (push) Successful in 12s
Build and deploy the backend to production / Deploy to production (push) Failing after 13s
2024-09-27 10:28:06 +02:00
3710a476e8 don't break when using sets
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m45s
Build and deploy the backend to staging / Deploy to staging (pull_request) Failing after 13s
2024-09-27 10:13:33 +02:00
cdc9b0ecd1 ensure attractiveness is always an int
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m4s
Build and deploy the backend to staging / Deploy to staging (pull_request) Failing after 18s
2024-09-27 09:47:10 +02:00
097abc5f29 remove gradle again
Some checks failed
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Has been cancelled
/ push-to-remote (push) Successful in 12s
2024-09-25 17:49:08 +02:00
55f8c938fb cleanup 2024-09-25 16:50:34 +02:00
59f3f0d454 update screenshots
Some checks failed
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Has been cancelled
/ push-to-remote (push) Successful in 41s
2024-09-25 16:20:33 +02:00
06dc0c7c3b Merge pull request 'Usability and styling' (#24) from feature/frontend-usability-and-styling into main
Some checks failed
Build and deploy the backend to production / Deploy to production (push) Has been cancelled
Build and deploy the backend to production / Build and push image (push) Has been cancelled
/ push-to-remote (push) Successful in 12s
Reviewed-on: #24
2024-09-25 13:26:19 +00:00
d37fb09d62 trip destination from current location
Some checks failed
Build and release APK / Build APK (pull_request) Failing after 5m54s
2024-09-25 15:10:49 +02:00
88b825ea31 better visual coherence 2024-09-25 14:48:33 +02:00
d323194ea7 Better location handling on map 2024-09-24 23:48:07 +02:00
ed60fcba06 revamp new trip flow 2024-09-24 22:58:28 +02:00
eaa2334942 ensure new images are pulled upon redeployment 2024-09-22 15:13:32 +02:00
f71aab22dc Merge branch 'feature/adding-timed-visits'
Some checks failed
Build and deploy the backend to production / Build and push image (push) Successful in 2m3s
/ push-to-remote (push) Failing after 22s
Build and deploy the backend to production / Deploy to production (push) Successful in 12s
2024-09-22 15:04:56 +02:00
1f7b006a64 Merge pull request 'Fix deployment configuration' (#23) from test/backend-deploy-pipeline into main
All checks were successful
Build and deploy the backend to production / Build and push image (push) Successful in 1m49s
/ push-to-remote (push) Successful in 12s
Build and deploy the backend to production / Deploy to production (push) Successful in 12s
Reviewed-on: #23
2024-09-22 12:31:20 +00:00
e5ea6e64e7 fix ingress
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 2m13s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 13s
2024-09-21 16:04:35 +02:00
9deb461925 a bit of documentation
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m43s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 13s
2024-09-21 15:10:48 +02:00
7414f7109d updated deployment config
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m38s
Build and deploy the backend to staging / Deploy to staging (pull_request) Failing after 12s
2024-09-21 14:56:40 +02:00
94d12f2983 even more fixes
All checks were successful
Build and deploy the backend to staging / Build and push image (pull_request) Successful in 1m38s
Build and deploy the backend to staging / Deploy to staging (pull_request) Successful in 9s
2024-09-21 14:22:28 +02:00
0e3f56a131 must be this format?
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 39s
Build and deploy the backend to staging / Deploy to production (pull_request) Failing after 8s
2024-09-21 13:14:11 +02:00
2f901e008e slight workflow test
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 0s
Build and deploy the backend to staging / Deploy to production (pull_request) Failing after 0s
2024-09-21 13:12:56 +02:00
f45dccff97 just a test
Some checks failed
Build and deploy the backend to staging / Build and push image (pull_request) Failing after 0s
Build and deploy the backend to staging / Deploy to production (pull_request) Failing after 0s
2024-09-21 13:09:27 +02:00
da85b91975 a better backend deployment workflow 2024-09-21 13:07:18 +02:00
156661bfba fix remote repo pushing 2024-09-20 14:56:08 +02:00
205ec37764 Merge branch 'feature/frontend/deploy-to-stores'
Some checks failed
/ push-to-remote (push) Failing after 3s
2024-09-18 16:11:56 +02:00
db821b4856 cleanup and documentation
Some checks failed
Build and release APK / Build APK (pull_request) Failing after 9m4s
2024-09-18 16:07:20 +02:00
b63c50ebd0 Merge pull request 'display more landmark information' (#21) from feature/frontend-more-landmark-information into main
Reviewed-on: #21
2024-09-18 13:07:53 +00:00
d4066a1726 cleanup of application to satisfy google requirements 2024-09-18 15:06:13 +02:00
7617c5788c launcher icon handling 2024-09-18 14:04:16 +02:00
1ade92aed3 better timing
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m49s
Build and release APK / Build APK (pull_request) Successful in 5m3s
2024-09-10 18:12:17 +02:00
d1c53e08bb improved tour length
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m39s
Build and release APK / Build APK (pull_request) Successful in 5m14s
2024-09-10 17:23:01 +02:00
797db5c1da set start and finish as primary
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m34s
Build and release APK / Build APK (pull_request) Successful in 5m3s
2024-09-10 17:11:36 +02:00
055089bf76 better landmarks log
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m5s
Build and release APK / Build APK (pull_request) Successful in 4m46s
2024-09-10 17:07:26 +02:00
3475990e5f fixed timing and optimizer speed
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m49s
Build and release APK / Build APK (pull_request) Successful in 5m40s
2024-09-10 16:53:16 +02:00
7cbf5a037c display more landmark information
All checks were successful
Build and release APK / Build APK (pull_request) Successful in 5m49s
2024-09-10 16:15:08 +02:00
20f20da9c5 fixed int score
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m43s
Build and release APK / Build APK (pull_request) Successful in 6m36s
2024-09-10 10:16:01 +02:00
b11f082803 more automation 2024-09-10 10:12:53 +02:00
0ae20e4995 use fastlane to deploy android app 2024-09-06 08:26:44 +02:00
728b954643 added image and website urls
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m15s
Build and release APK / Build APK (pull_request) Successful in 6m40s
2024-08-28 18:08:09 +02:00
83d83b03a9 Merge pull request 'fix memcache usage' (#18) from fix/backend-usable-memcached into main
Reviewed-on: #18
2024-08-24 15:45:47 +00:00
87ef37f0a9 Merge branch 'feature/frontend-onboarding' 2024-08-24 17:45:07 +02:00
5912f562c3 Merge pull request 'location picker and ui fixes' (#17) from feature/frontend/location-picker into main
Reviewed-on: #17
2024-08-24 15:36:10 +00:00
eaecc27305 onboarding screen proof of concept
All checks were successful
Build and release APK / Build APK (pull_request) Successful in 5m47s
2024-08-17 13:12:46 +02:00
003b8d0f9c better time management for optimizer 2024-08-12 18:52:01 +02:00
a1fcc8d23b added duration
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m30s
2024-08-12 16:07:53 +02:00
da921171e9 more balanced scores
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m41s
2024-08-12 15:58:30 +02:00
d24bc2470b navigation intent gets opened correctly
All checks were successful
Build and release APK / Build APK (pull_request) Successful in 5m50s
2024-08-11 16:06:20 +02:00
6d3399640e use more fitting floating action button, cleanup
All checks were successful
Build and release APK / Build APK (pull_request) Successful in 5m24s
2024-08-10 17:14:56 +02:00
e951032f11 fix memcache usage
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m37s
2024-08-10 15:12:39 +02:00
22ca038017 trip loading (from device storage) much improved
All checks were successful
Build and release APK / Build APK (pull_request) Successful in 5m18s
2024-08-09 11:39:11 +02:00
311b1c2218 location picker and ui fixes
All checks were successful
Build and release APK / Build APK (pull_request) Successful in 5m25s
2024-08-09 00:48:45 +02:00
bea3a65fec Merge pull request 'Tentatively enable communication between front + backend' (#15) from feature/frontend-backend-interoperability into main
Reviewed-on: #15
2024-08-06 13:51:31 +00:00
f71b9b19a6 show correct landmark types when fetching from api
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m42s
Build and release APK / Build APK (pull_request) Successful in 5m22s
2024-08-06 14:34:12 +02:00
89511f39cb better error handling, slimmed down optimizer
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m40s
Build and release APK / Build APK (pull_request) Successful in 5m26s
2024-08-05 16:03:29 +02:00
71d9554d97 ui improvements for trips and landmarks
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m48s
Build and release APK / Build APK (pull_request) Successful in 4m51s
2024-08-05 10:18:00 +02:00
c87a01b2e8 overhaul using a trip struct that notifies its ui dependencies
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m54s
Build and release APK / Build APK (pull_request) Successful in 5m32s
2024-08-03 17:17:48 +02:00
5748630b99 bare implementation of communication with the api
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m6s
Build and release APK / Build APK (pull_request) Successful in 4m32s
2024-08-01 22:48:28 +02:00
016622c7af frontend compliant with backend
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m38s
Build and release APK / Build APK (pull_request) Successful in 5m11s
2024-08-01 19:35:25 +02:00
bf129b201d more straightforward logging 2024-08-01 19:34:48 +02:00
86bcec6b29 frontend groundwork
Some checks failed
Build and release APK / Build APK (pull_request) Has been cancelled
Build and push docker image / Build (pull_request) Successful in 1m33s
2024-08-01 17:56:06 +02:00
07dde5ab58 persistence for recurring api calls
All checks were successful
Build and push docker image / Build (pull_request) Successful in 1m48s
2024-07-31 12:54:25 +02:00
db82495f11 rename frontend components to anyway 2024-07-30 22:49:28 +02:00
889b6c2096 added wikidata throttle to gitignore 2024-07-27 21:05:13 +02:00
35e0f3c400 Merge pull request 'Add readmes' (#12) from feature/documentation into main
Reviewed-on: remoll/fast-network-navigation#12
2024-07-27 12:10:12 +00:00
de5f1ec3d3 Merge pull request 'cleanup-backend' (#13) from cleanup-backend into main
Reviewed-on: remoll/fast-network-navigation#13
2024-07-27 12:09:54 +00:00
81c763587d Merge pull request 'style corrections, documentation, duplicate removal, flow improvement' (#11) from feature/backend-refactoring into cleanup-backend
Some checks failed
Build and push docker image / Build (pull_request) Failing after 10s
Reviewed-on: remoll/fast-network-navigation#11
2024-07-27 12:09:10 +00:00
3fa689fd16 a few docker-related fixes 2024-07-26 19:11:26 +02:00
2736a89f70 cleanup in view of docker builds 2024-07-26 13:13:36 +02:00
e50eedf099 add readmes with first pointers
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m15s
Build and release APK / Build APK (pull_request) Successful in 6m46s
Build web / Build Web (pull_request) Successful in 2m33s
2024-07-25 17:22:29 +02:00
2863c99d7c style corrections, documentation, duplicate removal, flow improvement 2024-07-25 17:15:18 +02:00
80b3d5b012 refactored landmark manager and clean up 2024-07-25 09:37:37 +02:00
d23050b811 fixed parameters folder 2024-07-24 10:34:34 +02:00
127ba8c028 fixed log 2024-07-21 10:16:13 +02:00
94fa735d54 cleaned up backend to use classes and yaml files 2024-07-20 23:16:35 +02:00
14a7f555df more coherent base types 2024-07-17 13:22:43 +02:00
4a291a69c9 Merge branch 'feature/backend/initial-deployment' 2024-07-17 12:47:20 +02:00
7d7a25e2f3 stage changes as reference implementation
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m50s
Build and release APK / Build APK (pull_request) Successful in 4m11s
Build web / Build Web (pull_request) Successful in 1m28s
2024-07-17 12:35:08 +02:00
f590ebb5ed Merge pull request 'Permafix-optimization-refiner' (#9) from Permafix-optimization-refiner into main
Reviewed-on: remoll/fast-network-navigation#9
2024-07-17 10:32:32 +00:00
b09ec2b083 changed bbox to meters
All checks were successful
Build and push docker image / Build (pull_request) Successful in 3m45s
Build and release APK / Build APK (pull_request) Successful in 6m57s
Build web / Build Web (pull_request) Successful in 2m33s
2024-07-17 12:30:47 +02:00
8d71cab8d5 osmnx does not behave 2024-07-17 12:00:40 +02:00
87df2f70e9 cleanup files 2024-07-17 11:59:42 +02:00
50bc8150c8 permafixed ? 2024-07-16 09:01:58 +02:00
4466f29a3d better map style 2024-07-08 12:21:22 +02:00
25cc0fa300 (theoretically) functional deployment 2024-07-08 11:55:27 +02:00
8f23a4747d further cleanup 2024-07-08 11:55:00 +02:00
4896e95617 cleaned up 2024-07-08 02:13:13 +02:00
30ed2bb9ed permafixed the optimizer ??? 2024-07-08 02:01:42 +02:00
568e7bfbc4 upgraded optimizer 2024-07-08 01:20:17 +02:00
d4e964c5d4 fixed the optimizer_v4 2024-07-07 16:24:15 +02:00
f9c86261cb switch to osmnx 2024-07-07 14:49:10 +02:00
e71c92da40 added some ideas 2024-07-07 10:17:50 +02:00
006b80018a Added 2024-07-05 17:21:47 +02:00
49ce8527a3 cleanup path handling for easier dockerization 2024-06-30 18:42:59 +02:00
bec1827891 Corrected optimizer and landmark attributes in backend 2024-06-26 14:23:51 +02:00
8e33bd1b3f base structs as agreed upon 2024-06-26 12:27:54 +02:00
c26d9222bd Merge branch 'feature/backend/unify-api-communication' 2024-06-26 11:05:57 +02:00
09bcd95cab space
Some checks failed
Build web / Build Web (pull_request) Has been cancelled
Build and release APK / Build APK (pull_request) Has been cancelled
Build and push docker image / Build (pull_request) Has been cancelled
2024-06-26 10:52:51 +02:00
fdcaaf8c16 updated refiner
Some checks failed
Build and push docker image / Build (pull_request) Successful in 3m17s
Build and release APK / Build APK (pull_request) Has been cancelled
Build web / Build Web (pull_request) Has been cancelled
2024-06-26 10:52:24 +02:00
8d068c80a7 Upgraded refiner
All checks were successful
Build and push docker image / Build (pull_request) Successful in 3m8s
Build and release APK / Build APK (pull_request) Successful in 5m0s
Build web / Build Web (pull_request) Successful in 1m41s
2024-06-24 11:06:01 +02:00
b6c9e61be9 Merge pull request 'UI elements using the new structs' (#8) from feature/unify-api-frontend into main
Reviewed-on: remoll/fast-network-navigation#8
2024-06-23 19:21:48 +00:00
eede94add4 working save and load functionality with custom datastructures
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m8s
Build and release APK / Build APK (pull_request) Successful in 5m15s
Build web / Build Web (pull_request) Successful in 1m13s
2024-06-23 21:19:06 +02:00
813c83a81d removed amentiy=arts_centre
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m10s
Build and release APK / Build APK (pull_request) Successful in 5m16s
Build web / Build Web (pull_request) Successful in 1m39s
2024-06-23 18:22:14 +02:00
fd378d6289 fixed refiner
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m36s
Build and release APK / Build APK (pull_request) Successful in 5m31s
Build web / Build Web (pull_request) Successful in 1m15s
2024-06-23 12:08:51 +02:00
34922a2645 Added refiner (for minor landmarks)
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m17s
Build and release APK / Build APK (pull_request) Successful in 5m13s
Build web / Build Web (pull_request) Successful in 1m14s
2024-06-23 10:50:44 +02:00
db41528702 functional datastructure. Needs to be able to write to storage as well
Some checks failed
Build and push docker image / Build (pull_request) Failing after 3m5s
Build and release APK / Build APK (pull_request) Successful in 7m24s
Build web / Build Web (pull_request) Successful in 3m36s
2024-06-21 19:30:40 +02:00
1f5bd92895 added pep8 example
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m59s
Build and release APK / Build APK (pull_request) Successful in 4m47s
Build web / Build Web (pull_request) Successful in 1m42s
2024-06-19 14:58:11 +02:00
111e6836f6 reviewed code structure, cleaned comments, now pep8 conform
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m17s
Build and release APK / Build APK (pull_request) Successful in 6m53s
Build web / Build Web (pull_request) Successful in 1m31s
2024-06-11 20:14:12 +02:00
af4d68f36f fixed optimizer and added a member to Landmark class
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m16s
Build and release APK / Build APK (pull_request) Successful in 5m28s
Build web / Build Web (pull_request) Successful in 1m23s
2024-06-11 10:46:09 +02:00
53a5a9e873 fixed duplicate landmarks
All checks were successful
Build and push docker image / Build (pull_request) Successful in 3m4s
Build and release APK / Build APK (pull_request) Successful in 5m44s
Build web / Build Web (pull_request) Successful in 2m14s
2024-06-10 22:52:08 +02:00
adbb6466d9 fixed optimizer. works fine now
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m17s
Build and release APK / Build APK (pull_request) Successful in 5m56s
Build web / Build Web (pull_request) Successful in 1m15s
2024-06-10 14:24:37 +02:00
9a5ae95d97 landmark styling
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m47s
Build and release APK / Build APK (pull_request) Successful in 4m39s
Build web / Build Web (pull_request) Successful in 1m21s
2024-06-07 15:09:18 +02:00
40943c5c5b finally use correct api key 2024-06-07 15:06:33 +02:00
040e5c9f83 cleaner ci
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m10s
Build and release APK / Build APK (pull_request) Successful in 6m34s
Build web / Build Web (pull_request) Successful in 1m42s
2024-06-07 10:44:37 +02:00
c58c10b057 fixed input as coordinates
Some checks failed
Build and push docker image / Build (pull_request) Failing after 36s
Build and release APK / Build APK (pull_request) Successful in 5m20s
Build web / Build Web (pull_request) Successful in 1m13s
2024-06-04 00:20:54 +02:00
d5e0b7d51a Beginning to use different contexts
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m49s
Build and release APK / Build APK (pull_request) Successful in 5m48s
Build web / Build Web (pull_request) Successful in 1m32s
2024-06-03 13:51:01 +02:00
ebaec40d6b Update frontend/lib/pages/profile.dart
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m3s
Build and release APK / Build APK (pull_request) Successful in 5m18s
Build web / Build Web (pull_request) Successful in 1m12s
2024-06-02 19:38:46 +00:00
c1d74ab227 use structs to draw custom map pointers
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m10s
Build and release APK / Build APK (pull_request) Successful in 5m22s
Build web / Build Web (pull_request) Successful in 1m14s
2024-06-01 19:58:25 +02:00
8bc7da0b3e first ui elements using the new structs
Some checks failed
Build and push docker image / Build (pull_request) Failing after 41s
Build and release APK / Build APK (pull_request) Successful in 5m25s
Build web / Build Web (pull_request) Successful in 1m17s
2024-05-31 21:33:04 +02:00
bcc91c638d Merge branch 'feature/backend/unify-api-communication' of ssh://git.kluster.moll.re:2222/remoll/fast-network-navigation into feature/backend/unify-api-communication
All checks were successful
Build and push docker image / Build (pull_request) Successful in 2m8s
Build and release APK / Build APK (pull_request) Successful in 5m14s
Build web / Build Web (pull_request) Successful in 1m34s
2024-05-30 00:50:56 +02:00
d88f22121e started to implement overpass queries 2024-05-30 00:48:38 +02:00
beee9614c5 Update .gitea/workflows/backed_build-image.yaml
All checks were successful
Build and push docker image / Build (pull_request) Successful in 3m4s
Build and release APK / Build APK (pull_request) Successful in 4m49s
Build web / Build Web (pull_request) Successful in 1m51s
2024-05-29 22:37:22 +00:00
03da8441f2 cleaned up folders and defined proper structs
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m14s
Build and release APK / Build APK (pull_request) Successful in 5m15s
Build web / Build Web (pull_request) Successful in 2m32s
2024-05-29 22:57:11 +02:00
7e4538a1bf Merge branch 'feature/use-google-maps'
Some checks failed
Build and push docker image / Build (pull_request) Failing after 2m11s
Build and release APK / Build APK (pull_request) Successful in 5m9s
Build web / Build Web (pull_request) Successful in 1m26s
2024-05-29 18:25:06 +02:00
80b04905de Merge branch 'feature/optimizer-use-fastapi' 2024-05-29 18:23:34 +02:00
ae9860cc8e proper fast api usage
Some checks failed
Test code / Test code (push) Has been cancelled
Build and release APK / Build APK (pull_request) Has been cancelled
Build web / Build Web (pull_request) Has been cancelled
Test code / Test code (pull_request) Has been cancelled
2024-05-29 18:14:40 +02:00
3029fb8537 some preference improvements
Some checks failed
Build and release APK / Build APK (pull_request) Successful in 7m27s
Test code / Test code (push) Has been cancelled
Build web / Build Web (pull_request) Has been cancelled
Test code / Test code (pull_request) Has been cancelled
2024-05-25 18:55:58 +02:00
f292b2a375 Update .gitea/workflows/backed_build-image.yaml 2024-05-25 12:23:22 +00:00
6dc5d560ae Merge pull request 'Automated docker builds' (#5) from feature/docker-cicd into main
Reviewed-on: remoll/fast-network-navigation#5
2024-05-25 12:20:19 +00:00
cb33f315a0 final building cleanup
Some checks failed
Build web / Build Web (pull_request) Has been cancelled
Build and release APK / Build APK (pull_request) Has been cancelled
Build and push docker image / Build (pull_request) Failing after 1m42s
2024-05-25 14:19:57 +02:00
656c606493 first try of automated docker builds
Some checks failed
Test code / Test code (push) Has been cancelled
Build web / Build Web (pull_request) Has been cancelled
Test code / Test code (pull_request) Has been cancelled
Build and push docker image / Build (pull_request) Successful in 1m29s
Build and release APK / Build APK (pull_request) Successful in 8m31s
2024-05-25 13:44:17 +02:00
99dc8b5e67 added OSM api 2024-05-23 12:20:32 +02:00
d8d425a922 added dataclass
Some checks failed
Test code / Test code (push) Failing after 3s
Build and release APK / Build APK (pull_request) Has been cancelled
Build web / Build Web (pull_request) Has been cancelled
Test code / Test code (pull_request) Has been cancelled
2024-05-23 10:39:14 +02:00
70cebc0aa1 fixed return order
Some checks failed
Test code / Test code (push) Failing after 2s
2024-05-22 15:12:54 +02:00
101af0ebc6 included fastapi
Some checks failed
Test code / Test code (push) Failing after 3s
2024-05-22 15:01:46 +02:00
2b31ce5f6b Cleanup and created main
Some checks failed
Test code / Test code (push) Failing after 2s
2024-05-22 10:16:33 +02:00
82a864e57f fixed circular symmetry
Some checks failed
Test code / Test code (push) Failing after 2s
2024-05-22 00:40:57 +02:00
07830de1b2 Update .gitea/workflows/frontend_build-android.yaml
Some checks failed
Test code / Test code (push) Failing after 1s
Test code / Test code (pull_request) Failing after 0s
Build web / Build Web (pull_request) Has been cancelled
Build and release APK / Build APK (pull_request) Successful in 8m7s
2024-05-21 06:29:10 +00:00
3688229d7b Update frontend/android/app/src/main/AndroidManifest.xml
Some checks failed
Test code / Test code (push) Waiting to run
Build and release APK / Build APK (pull_request) Successful in 5m50s
Build web / Build Web (pull_request) Failing after 1s
Test code / Test code (pull_request) Failing after 0s
2024-05-20 22:18:08 +00:00
6405f33a34 update flutter builder
Some checks failed
Test code / Test code (push) Waiting to run
Build and release APK / Build APK (pull_request) Failing after 5m18s
Build web / Build Web (pull_request) Failing after 0s
Test code / Test code (pull_request) Failing after 0s
2024-05-20 21:38:20 +02:00
e600f40c1a corrected symmetry constraint
Some checks failed
Test code / Test code (push) Failing after 26s
2024-05-20 01:37:35 +02:00
195 changed files with 12687 additions and 1636 deletions


@@ -0,0 +1,24 @@
on:
push:
tags:
- v*
name: Build and deploy the backend to production
jobs:
build-and-push:
name: Build and push image
uses: ./.gitea/workflows/workflow_build-image.yaml
with:
tag: stable
secrets:
PACKAGE_REGISTRY_ACCESS: ${{ secrets.PACKAGE_REGISTRY_ACCESS }}
deploy-prod:
name: Deploy to production
uses: ./.gitea/workflows/workflow_deploy-container.yaml
with:
overlay: prod
secrets:
KUBE_CONFIG: ${{ secrets.KUBE_CONFIG }}
needs: build-and-push


@@ -0,0 +1,26 @@
on:
pull_request:
branches:
- main
paths:
- backend/**
name: Build and deploy the backend to staging
jobs:
build-and-push:
name: Build and push image
uses: ./.gitea/workflows/workflow_build-image.yaml
with:
tag: unstable
secrets:
PACKAGE_REGISTRY_ACCESS: ${{ secrets.PACKAGE_REGISTRY_ACCESS }}
deploy-prod:
name: Deploy to staging
uses: ./.gitea/workflows/workflow_deploy-container.yaml
with:
overlay: stg
secrets:
KUBE_CONFIG: ${{ secrets.KUBE_CONFIG }}
needs: build-and-push


@@ -0,0 +1,32 @@
on:
pull_request:
branches:
- main
paths:
- backend/**
name: Run linting on the backend code
jobs:
build:
name: Build
runs-on: ubuntu-latest
steps:
- uses: https://gitea.com/actions/checkout@v4
- name: Install dependencies
run: |
apt-get update && apt-get install -y python3 python3-pip
pip install pipenv
- name: Install packages
run: |
ls -la
# only install dev-packages
pipenv install --categories=dev-packages
working-directory: backend
- name: Run linter
run: pipenv run pylint src --fail-under=9
working-directory: backend


@@ -0,0 +1,39 @@
on:
pull_request:
branches:
- main
paths:
- backend/**
name: Run testing on the backend code
jobs:
build:
name: Build
runs-on: ubuntu-latest
steps:
- uses: https://gitea.com/actions/checkout@v4
- name: Install dependencies
run: |
apt-get update && apt-get install -y python3 python3-pip
pip install pipenv
- name: Install packages
run: |
ls -la
# install all packages, including dev-packages
pipenv install --dev
working-directory: backend
- name: Run Tests
run: pipenv run pytest src --html=report.html --self-contained-html --log-cli-level=INFO
working-directory: backend
- name: Upload HTML report
if: always()
uses: https://gitea.com/actions/upload-artifact@v3
with:
name: pytest-html-report
path: backend/report.html


@@ -2,8 +2,11 @@ on:
pull_request:
branches:
- main
paths:
- frontend/**
name: Build and release APK
name: Build and release debug APK
jobs:
build:
@@ -25,7 +28,7 @@ jobs:
distribution: 'zulu'
- name: Fix flutter SDK folder permission
run: git config --global --add safe.directory /opt/hostedtoolcache/flutter/--
run: git config --global --add safe.directory "*"
- uses: https://github.com/subosito/flutter-action@v2
with:
@@ -39,21 +42,26 @@ jobs:
- run: flutter pub get
working-directory: ./frontend
- run: flutter build apk --release --split-per-abi
- name: Add required secrets
env:
ANDROID_SECRETS_PROPERTIES: ${{ secrets.ANDROID_SECRETS_PROPERTIES }}
run: |
echo "$ANDROID_SECRETS_PROPERTIES" >> ./android/secrets.properties
working-directory: ./frontend
- name: Sanity check
run: |
ls
ls -lah android
working-directory: ./frontend
- name: Release APK
uses: https://gitea.com/akkuman/gitea-release-action@v1
- run: flutter build apk --debug --split-per-abi --build-number=${{ gitea.run_number }}
working-directory: ./frontend
- name: Upload APKs to artifacts
uses: https://gitea.com/actions/upload-artifact@v3
with:
files: ./frontendbuild/app/outputs/flutter-apk/*.apk
name: Testing release
release_name: testing
tag: testing
tag_name: testing
release_body: "This is a testing release."
prerelease: true
token: ${{ secrets.GITEA_TOKEN }}
env:
NODE_OPTIONS: '--experimental-fetch'
name: app-release
path: frontend/build/app/outputs/flutter-apk/
if-no-files-found: error
retention-days: 15


@@ -1,32 +1,34 @@
on:
pull_request:
branches:
- main
name: Build web
jobs:
build:
name: Build Web
runs-on: k8s
steps:
- name: Install prerequisites
run: |
sudo apt-get update
sudo apt-get install -y xz-utils
- uses: actions/checkout@v4
- uses: https://github.com/subosito/flutter-action@v2
with:
channel: stable
flutter-version: 3.19.6
cache: true
- run: flutter pub get
working-directory: ./frontend
# on:
# pull_request:
# branches:
# - main
# paths:
# - frontend/**
- run: flutter build web
working-directory: ./frontend
# name: Build web
# jobs:
# build:
# name: Build Web
# runs-on: ubuntu-latest
# steps:
# - name: Install prerequisites
# run: |
# sudo apt-get update
# sudo apt-get install -y xz-utils
# - uses: actions/checkout@v4
# - uses: https://github.com/subosito/flutter-action@v2
# with:
# channel: stable
# flutter-version: 3.19.6
# cache: true
# - run: flutter pub get
# working-directory: ./frontend
# - run: flutter build web
# working-directory: ./frontend


@@ -1,33 +0,0 @@
on:
push:
pull_request:
branches:
- main
name: Test code
jobs:
test:
name: Test code
runs-on: k8s
steps:
- name: Install prerequisites
run: |
sudo apt-get update
sudo apt-get install -y xz-utils
- uses: actions/checkout@v4
- uses: https://github.com/subosito/flutter-action@v2
with:
channel: stable
flutter-version: 3.19.6
cache: true
- run: flutter pub get
working-directory: ./frontend
- run: flutter test
working-directory: ./frontend


@@ -0,0 +1,39 @@
on:
push:
tags:
- v*
jobs:
push-to-remote:
# We want to use the macos runner provided by github actions. This requires pushing to a remote first.
# After the push we can use the action under frontend/.github/actions/ to deploy properly using fastlane on macos.
runs-on: ubuntu-latest
steps:
- name: Checkout code
uses: actions/checkout@v3
with:
path: 'src'
- name: Checkout remote repository
uses: actions/checkout@v3
with:
path: 'dest'
ref: 'main'
github-server-url: 'https://github.com'
repository: 'moll-re/anyway-frontend-builder'
token: ${{ secrets.PUSH_GITHUB_API_TOKEN }}
fetch-depth: 0
persist-credentials: true
- name: Copy files to remote repository
run: cp -r src/frontend/. dest/
- name: Commit and push changes
run: |
cd dest
git config --global user.email "me@moll.re"
git config --global user.name "[bot]"
git add .
git commit -m "Automatic code update for tag"
git tag -a ${{ github.ref_name }} -m "mirrored tag"
git push origin main --tags


@@ -0,0 +1,38 @@
on:
workflow_call:
inputs:
tag:
required: true
type: string
secrets:
PACKAGE_REGISTRY_ACCESS:
required: true
name: Build and push docker image
jobs:
build:
name: Build
runs-on: ubuntu-latest
steps:
- uses: https://gitea.com/actions/checkout@v4
- name: Login to Docker Registry
uses: docker/login-action@v3
with:
registry: git.kluster.moll.re
username: ${{ gitea.repository_owner }}
password: ${{ secrets.PACKAGE_REGISTRY_ACCESS }}
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@v3
- name: Build and push
uses: docker/build-push-action@v5
with:
context: backend
tags: git.kluster.moll.re/anydev/anyway-backend:${{ inputs.tag }}
push: true


@@ -0,0 +1,35 @@
on:
workflow_call:
inputs:
overlay:
required: true
type: string
secrets:
KUBE_CONFIG:
required: true
name: Deploy the newly built container
jobs:
deploy:
name: Deploy
runs-on: ubuntu-latest
steps:
- uses: https://gitea.com/actions/checkout@v4
with:
submodules: true
- name: setup kubectl
uses: https://github.com/azure/setup-kubectl@v4
- name: Set kubeconfig
run: |
echo "${{ secrets.KUBE_CONFIG }}" > kubeconfig
- name: Deploy to k8s
run: |
kubectl apply -k backend/deployment/overlays/${{ inputs.overlay }} --kubeconfig=kubeconfig
kubectl -n anyway-backend rollout restart deployment/anyway-backend-${{ inputs.overlay }} --kubeconfig=kubeconfig
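
The deploy step above applies a kustomize overlay selected by the `overlay` input (`prod` or `stg`) from the `backend/deployment` submodule, then restarts the matching deployment. As a rough illustration only (the real overlay lives in the anyway-backend-deployment submodule and may differ), an overlay of that shape could look like:

```yaml
# backend/deployment/overlays/stg/kustomization.yaml -- illustrative sketch,
# not the actual file from the deployment submodule
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
namespace: anyway-backend
resources:
  - ../../base          # shared Deployment/Service manifests (assumed layout)
nameSuffix: -stg        # yields deployment/anyway-backend-stg, matching the rollout restart above
images:
  - name: git.kluster.moll.re/anydev/anyway-backend
    newTag: unstable    # the tag pushed by the staging build job
```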

.gitignore vendored Normal file (1 changed line)

@@ -0,0 +1 @@
cache/

.gitmodules vendored Normal file (3 changed lines)

@@ -0,0 +1,3 @@
[submodule "backend/deployment"]
path = backend/deployment
url = https://git.kluster.moll.re/anydev/anyway-backend-deployment

.vscode/launch.json vendored Normal file (50 changed lines)

@@ -0,0 +1,50 @@
{
// Use IntelliSense to learn about possible attributes.
// Hover to view descriptions of existing attributes.
// For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
"version": "0.2.0",
"configurations": [
// backend - python using fastapi
{
"name": "Backend - debug",
"type": "debugpy",
"request": "launch",
"env": {
"DEBUG": "true"
},
"jinja": true,
"cwd": "${workspaceFolder}/backend",
"module": "fastapi",
"args": [
"dev",
"src/main.py"
]
},
{
"name": "Backend - tester",
"type": "debugpy",
"request": "launch",
"program": "src/tester.py",
"env": {
"DEBUG": "true"
},
"cwd": "${workspaceFolder}/backend"
},
// frontend - flutter app
{
"name": "Frontend - debug",
"type": "dart",
"request": "launch",
"program": "lib/main.dart",
"cwd": "${workspaceFolder}/frontend"
},
{
"name": "Frontend - profile",
"type": "dart",
"request": "launch",
"program": "lib/main.dart",
"flutterMode": "profile",
"cwd": "${workspaceFolder}/frontend"
}
]
}


@@ -1,16 +1,40 @@
# fast_network_navigation
# AnyWay - plan city trips your way
AnyWay is a mobile application that helps users plan city trips. The app allows users to specify their preferences and constraints, and then generates a personalized itinerary for them. The planning follows some guiding principles:
- **Personalization**: The user's preferences should be reflected in the choice of destinations.
- **Efficiency**: The itinerary should be optimized for the user's constraints.
- **Flexibility**: We acknowledge that tourism is a dynamic activity, and that users may want to change their plans on the go.
- **Discoverability**: Tourism is an inherently exploratory activity. Once a rough itinerary is generated, detours and spontaneous decisions should be encouraged.
## Architecture
This project is divided into two main components: a frontend and a backend. The frontend is a Flutter application that runs on Android and iOS. The backend is a python server that runs on a cloud provider. The two components communicate via a REST API. Since both components are very interdependent and share many data structures, we opted for a monorepo approach.
### Frontend
See the [frontend README](frontend/README.md) for more information. The application is centered around its map view, which displays the user's itinerary. This is based on the Google Maps API.
### Backend
See the [backend README](backend/README.md) for more information. The backend is responsible for generating the itinerary based on the user's preferences and constraints. Rather than using google maps, we use the OpenStreetMap database through the Overpass API, which is much more flexible.
A new Flutter project.
## Getting Started
Refer to the READMEs in the `frontend` and `backend` directories for instructions on how to run the application locally. Notable prerequisites include:
- Flutter SDK
- `google_maps_flutter` plugin
- Python 3
- `fastapi`
- `numpy`
- `pydantic`
- Docker
This project is a starting point for a Flutter application.
A few resources to get you started if this is your first Flutter project:
## Releases
The project is still in its early stages. We are currently working on a prototype that demonstrates the core functionality of the application.
- [Lab: Write your first Flutter app](https://docs.flutter.dev/get-started/codelab)
- [Cookbook: Useful Flutter samples](https://docs.flutter.dev/cookbook)
Official releases will be made available on the Google Play Store and the Apple App Store.
For help getting started with Flutter development, view the
[online documentation](https://docs.flutter.dev/), which offers tutorials,
samples, guidance on mobile development, and a full API reference.
## Roadmap
See the [project board](https://todos.kluster.moll.re/share/L8vZJGU45Vx9RzzVqTzyCPrpRybXm1OAqaW7YkWb/auth?view=list):
<iframe src="https://todos.kluster.moll.re/share/L8vZJGU45Vx9RzzVqTzyCPrpRybXm1OAqaW7YkWb/auth?view=list" title="Todos" width="100%">
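
The backend README above notes that itineraries are built from the OpenStreetMap database via the Overpass API rather than Google Maps. As a minimal sketch of that idea (the function name, tag filter, and radius here are illustrative assumptions, not the project's actual code), an Overpass QL query for landmarks around a point can be assembled like this:

```python
def build_overpass_query(lat: float, lon: float, radius_m: int,
                         tag: str = "tourism", value: str = "attraction") -> str:
    """Build an Overpass QL query for OSM nodes with tag=value near (lat, lon).

    Illustrative only: the real backend's query construction may differ.
    """
    return (
        "[out:json];"
        f'node["{tag}"="{value}"](around:{radius_m},{lat},{lon});'
        "out body;"
    )

# Example: tourist attractions within 500 m of the Eiffel Tower
query = build_overpass_query(48.8584, 2.2945, 500)
```

The resulting string would typically be POSTed to a public Overpass endpoint such as https://overpass-api.de/api/interpreter.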

backend/.gitignore vendored Normal file (165 changed lines)

@@ -0,0 +1,165 @@
# osm-cache
cache_XML/
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/
cover/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
.pybuilder/
target/
# Jupyter Notebook
.ipynb_checkpoints
# IPython
profile_default/
ipython_config.py
# pyenv
# For a library or package, you might want to ignore these files since the code is
# intended to run in multiple environments; otherwise, check them in:
# .python-version
# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock
# poetry
# Similar to Pipfile.lock, it is generally recommended to include poetry.lock in version control.
# This is especially recommended for binary packages to ensure reproducibility, and is more
# commonly ignored for libraries.
# https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control
#poetry.lock
# pdm
# Similar to Pipfile.lock, it is generally recommended to include pdm.lock in version control.
#pdm.lock
# pdm stores project-wide configurations in .pdm.toml, but it is recommended to not include it
# in version control.
# https://pdm.fming.dev/latest/usage/project/#working-with-version-control
.pdm.toml
.pdm-python
.pdm-build/
# PEP 582; used by e.g. github.com/David-OConnor/pyflow and github.com/pdm-project/pdm
__pypackages__/
# Celery stuff
celerybeat-schedule
celerybeat.pid
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/
.dmypy.json
dmypy.json
# Pyre type checker
.pyre/
# pytype static type analyzer
.pytype/
# Cython debug symbols
cython_debug/
# PyCharm
# JetBrains specific template is maintained in a separate JetBrains.gitignore that can
# be found at https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore
# and can be added to the global gitignore or merged into this file. For a more nuclear
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/

backend/.pylintrc Normal file (654 changed lines)

@@ -0,0 +1,654 @@
[MAIN]
# Analyse import fallback blocks. This can be used to support both Python 2 and
# 3 compatible code, which means that the block might have code that exists
# only in one or another interpreter, leading to false positives when analysed.
analyse-fallback-blocks=no
# Clear in-memory caches upon conclusion of linting. Useful if running pylint
# in a server-like mode.
clear-cache-post-run=no
# Load and enable all available extensions. Use --list-extensions to see a list
# all available extensions.
#enable-all-extensions=
# In error mode, messages with a category besides ERROR or FATAL are
# suppressed, and no reports are done by default. Error mode is compatible with
# disabling specific errors.
#errors-only=
# Always return a 0 (non-error) status code, even if lint errors are found.
# This is primarily useful in continuous integration scripts.
#exit-zero=
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code.
extension-pkg-allow-list=
# A comma-separated list of package or module names from where C extensions may
# be loaded. Extensions are loading into the active Python interpreter and may
# run arbitrary code. (This is an alternative name to extension-pkg-allow-list
# for backward compatibility.)
extension-pkg-whitelist=
# Return non-zero exit code if any of these messages/categories are detected,
# even if score is above --fail-under value. Syntax same as enable. Messages
# specified are enabled, while categories only check already-enabled messages.
fail-on=
# Specify a score threshold under which the program will exit with error.
fail-under=10
# Interpret the stdin as a python script, whose filename needs to be passed as
# the module_or_package argument.
#from-stdin=
# Files or directories to be skipped. They should be base names, not paths.
ignore=CVS
# Add files or directories matching the regular expressions patterns to the
# ignore-list. The regex matches against paths and can be in Posix or Windows
# format. Because '\\' represents the directory delimiter on Windows systems,
# it can't be used as an escape character.
ignore-paths=
# Files or directories matching the regular expression patterns are skipped.
# The regex matches against base names, not paths. The default value ignores
# Emacs file locks
ignore-patterns=^\.#
# List of module names for which member attributes should not be checked and
# will not be imported (useful for modules/projects where namespaces are
# manipulated during runtime and thus existing member attributes cannot be
# deduced by static analysis). It supports qualified module names, as well as
# Unix pattern matching.
ignored-modules=
# Python code to execute, usually for sys.path manipulation such as
# pygtk.require().
#init-hook=
# Use multiple processes to speed up Pylint. Specifying 0 will auto-detect the
# number of processors available to use, and will cap the count on Windows to
# avoid hangs.
jobs=1
# Control the amount of potential inferred values when inferring a single
# object. This can help the performance when dealing with large functions or
# complex, nested conditions.
limit-inference-results=100
# List of plugins (as comma separated values of python module names) to load,
# usually to register additional checkers.
load-plugins=
# Pickle collected data for later comparisons.
persistent=yes
# Resolve imports to .pyi stubs if available. May reduce no-member messages and
# increase not-an-iterable messages.
prefer-stubs=no
# Minimum Python version to use for version dependent checks. Will default to
# the version used to run pylint.
py-version=3.12
# Discover python modules and packages in the file system subtree.
recursive=no
# Add paths to the list of the source roots. Supports globbing patterns. The
# source root is an absolute path or a path relative to the current working
# directory used to determine a package namespace for modules located under the
# source root.
source-roots=
# When enabled, pylint would attempt to guess common misconfiguration and emit
# user-friendly hints instead of false-positive error messages.
suggestion-mode=yes
# Allow loading of arbitrary C extensions. Extensions are imported into the
# active Python interpreter and may run arbitrary code.
unsafe-load-any-extension=no
# In verbose mode, extra non-checker-related info will be displayed.
#verbose=
[BASIC]
# Naming style matching correct argument names.
argument-naming-style=snake_case
# Regular expression matching correct argument names. Overrides argument-
# naming-style. If left empty, argument names will be checked with the set
# naming style.
#argument-rgx=
# Naming style matching correct attribute names.
attr-naming-style=snake_case
# Regular expression matching correct attribute names. Overrides attr-naming-
# style. If left empty, attribute names will be checked with the set naming
# style.
#attr-rgx=
# Bad variable names which should always be refused, separated by a comma.
bad-names=foo,
bar,
baz,
toto,
tutu,
tata
# Bad variable names regexes, separated by a comma. If names match any regex,
# they will always be refused
bad-names-rgxs=
# Naming style matching correct class attribute names.
class-attribute-naming-style=any
# Regular expression matching correct class attribute names. Overrides class-
# attribute-naming-style. If left empty, class attribute names will be checked
# with the set naming style.
#class-attribute-rgx=
# Naming style matching correct class constant names.
class-const-naming-style=UPPER_CASE
# Regular expression matching correct class constant names. Overrides class-
# const-naming-style. If left empty, class constant names will be checked with
# the set naming style.
#class-const-rgx=
# Naming style matching correct class names.
class-naming-style=PascalCase
# Regular expression matching correct class names. Overrides class-naming-
# style. If left empty, class names will be checked with the set naming style.
#class-rgx=
# Naming style matching correct constant names.
const-naming-style=UPPER_CASE
# Regular expression matching correct constant names. Overrides const-naming-
# style. If left empty, constant names will be checked with the set naming
# style.
#const-rgx=
# Minimum line length for functions/classes that require docstrings, shorter
# ones are exempt.
docstring-min-length=-1
# Naming style matching correct function names.
function-naming-style=snake_case
# Regular expression matching correct function names. Overrides function-
# naming-style. If left empty, function names will be checked with the set
# naming style.
#function-rgx=
# Good variable names which should always be accepted, separated by a comma.
good-names=i,
j,
k,
ex,
Run,
_
# Good variable names regexes, separated by a comma. If names match any regex,
# they will always be accepted
good-names-rgxs=
# Include a hint for the correct naming format with invalid-name.
include-naming-hint=no
# Naming style matching correct inline iteration names.
inlinevar-naming-style=any
# Regular expression matching correct inline iteration names. Overrides
# inlinevar-naming-style. If left empty, inline iteration names will be checked
# with the set naming style.
#inlinevar-rgx=
# Naming style matching correct method names.
method-naming-style=snake_case
# Regular expression matching correct method names. Overrides method-naming-
# style. If left empty, method names will be checked with the set naming style.
#method-rgx=
# Naming style matching correct module names.
module-naming-style=snake_case
# Regular expression matching correct module names. Overrides module-naming-
# style. If left empty, module names will be checked with the set naming style.
#module-rgx=
# Colon-delimited sets of names that determine each other's naming style when
# the name regexes allow several styles.
name-group=
# Regular expression which should only match function or class names that do
# not require a docstring.
no-docstring-rgx=^_
# List of decorators that produce properties, such as abc.abstractproperty. Add
# to this list to register other decorators that produce valid properties.
# These decorators are taken in consideration only for invalid-name.
property-classes=abc.abstractproperty
# Regular expression matching correct type alias names. If left empty, type
# alias names will be checked with the set naming style.
#typealias-rgx=
# Regular expression matching correct type variable names. If left empty, type
# variable names will be checked with the set naming style.
#typevar-rgx=
# Naming style matching correct variable names.
variable-naming-style=snake_case
# Regular expression matching correct variable names. Overrides variable-
# naming-style. If left empty, variable names will be checked with the set
# naming style.
#variable-rgx=
[CLASSES]
# Warn about protected attribute access inside special methods
check-protected-access-in-special-methods=no
# List of method names used to declare (i.e. assign) instance attributes.
defining-attr-methods=__init__,
__new__,
setUp,
asyncSetUp,
__post_init__
# List of member names, which should be excluded from the protected access
# warning.
exclude-protected=_asdict,_fields,_replace,_source,_make,os._exit
# List of valid names for the first argument in a class method.
valid-classmethod-first-arg=cls
# List of valid names for the first argument in a metaclass class method.
valid-metaclass-classmethod-first-arg=mcs
[DESIGN]
# List of regular expressions of class ancestor names to ignore when counting
# public methods (see R0903)
exclude-too-few-public-methods=
# List of qualified class names to ignore when counting class parents (see
# R0901)
ignored-parents=
# Maximum number of arguments for function / method.
max-args=5
# Maximum number of attributes for a class (see R0902).
max-attributes=20
# Maximum number of boolean expressions in an if statement (see R0916).
max-bool-expr=5
# Maximum number of branch for function / method body.
max-branches=12
# Maximum number of locals for function / method body.
max-locals=30
# Maximum number of parents for a class (see R0901).
max-parents=7
# Maximum number of positional arguments for function / method.
max-positional-arguments=5
# Maximum number of public methods for a class (see R0904).
max-public-methods=20
# Maximum number of return / yield for function / method body.
max-returns=6
# Maximum number of statements in function / method body.
max-statements=50
# Minimum number of public methods for a class (see R0903).
min-public-methods=2
[EXCEPTIONS]
# Exceptions that will emit a warning when caught.
overgeneral-exceptions=builtins.BaseException,builtins.Exception
[FORMAT]
# Expected format of line ending, e.g. empty (any line ending), LF or CRLF.
expected-line-ending-format=
# Regexp for a line that is allowed to be longer than the limit.
ignore-long-lines=^\s*(# )?<?https?://\S+>?$
# Number of spaces of indent required inside a hanging or continued line.
indent-after-paren=4
# String used as indentation unit. This is usually " " (4 spaces) or "\t" (1
# tab).
indent-string=' '
# Maximum number of characters on a single line.
max-line-length=105
# Maximum number of lines in a module.
max-module-lines=1000
# Allow the body of a class to be on the same line as the declaration if body
# contains single statement.
single-line-class-stmt=no
# Allow the body of an if to be on the same line as the test if there is no
# else.
single-line-if-stmt=no
[IMPORTS]
# List of modules that can be imported at any level, not just the top level
# one.
allow-any-import-level=
# Allow explicit reexports by alias from a package __init__.
allow-reexport-from-package=no
# Allow wildcard imports from modules that define __all__.
allow-wildcard-with-all=no
# Deprecated modules which should not be used, separated by a comma.
deprecated-modules=
# Output a graph (.gv or any supported image format) of external dependencies
# to the given file (report RP0402 must not be disabled).
ext-import-graph=
# Output a graph (.gv or any supported image format) of all (i.e. internal and
# external) dependencies to the given file (report RP0402 must not be
# disabled).
import-graph=
# Output a graph (.gv or any supported image format) of internal dependencies
# to the given file (report RP0402 must not be disabled).
int-import-graph=
# Force import order to recognize a module as part of the standard
# compatibility libraries.
known-standard-library=
# Force import order to recognize a module as part of a third party library.
known-third-party=enchant
# Couples of modules and preferred modules, separated by a comma.
preferred-modules=
[LOGGING]
# The type of string formatting that logging methods do. `old` means using %
# formatting, `new` is for `{}` formatting.
logging-format-style=new
# Logging modules to check that the string format arguments are in logging
# function parameter format.
logging-modules=logging
[MESSAGES CONTROL]
# Only show warnings with the listed confidence levels. Leave empty to show
# all. Valid levels: HIGH, CONTROL_FLOW, INFERENCE, INFERENCE_FAILURE,
# UNDEFINED.
confidence=HIGH,
CONTROL_FLOW,
INFERENCE,
INFERENCE_FAILURE,
UNDEFINED
# Disable the message, report, category or checker with the given id(s). You
# can either give multiple identifiers separated by comma (,) or put this
# option multiple times (only on the command line, not in the configuration
# file where it should appear only once). You can also use "--disable=all" to
# disable everything first and then re-enable specific checks. For example, if
# you want to run only the similarities checker, you can use "--disable=all
# --enable=similarities". If you want to run only the classes checker, but have
# no Warning level messages displayed, use "--disable=all --enable=classes
# --disable=W".
disable=raw-checker-failed,
bad-inline-option,
locally-disabled,
file-ignored,
suppressed-message,
useless-suppression,
deprecated-pragma,
use-symbolic-message-instead,
use-implicit-booleaness-not-comparison-to-string,
use-implicit-booleaness-not-comparison-to-zero,
import-error,
multiple-statements,
line-too-long,
logging-fstring-interpolation,
duplicate-code,
relative-beyond-top-level,
invalid-name
# Enable the message, report, category or checker with the given id(s). You can
# either give multiple identifier separated by comma (,) or put this option
# multiple time (only on the command line, not in the configuration file where
# it should appear only once). See also the "--disable" option for examples.
enable=
[METHOD_ARGS]
# List of qualified names (i.e., library.method) which require a timeout
# parameter e.g. 'requests.api.get,requests.api.post'
timeout-methods=requests.api.delete,requests.api.get,requests.api.head,requests.api.options,requests.api.patch,requests.api.post,requests.api.put,requests.api.request
[MISCELLANEOUS]
# List of note tags to take in consideration, separated by a comma.
notes=FIXME,
XXX,
TODO
# Regular expression of note tags to take in consideration.
notes-rgx=
[REFACTORING]
# Maximum number of nested blocks for function / method body
max-nested-blocks=5
# Complete name of functions that never returns. When checking for
# inconsistent-return-statements if a never returning function is called then
# it will be considered as an explicit return statement and no message will be
# printed.
never-returning-functions=sys.exit,argparse.parse_error
# Let 'consider-using-join' be raised when the separator to join on would be
# non-empty (resulting in expected fixes of the type: ``"- " + " -
# ".join(items)``)
suggest-join-with-non-empty-separator=yes
[REPORTS]
# Python expression which should return a score less than or equal to 10. You
# have access to the variables 'fatal', 'error', 'warning', 'refactor',
# 'convention', and 'info' which contain the number of messages in each
# category, as well as 'statement' which is the total number of statements
# analyzed. This score is used by the global evaluation report (RP0004).
evaluation=max(0, 0 if fatal else 10.0 - ((float(5 * error + warning + refactor + convention) / statement) * 10))
# Template used to display messages. This is a python new-style format string
# used to format the message information. See doc for all details.
msg-template=
# Set the output format. Available formats are: text, parseable, colorized,
# json2 (improved json format), json (old json format) and msvs (visual
# studio). You can also give a reporter class, e.g.
# mypackage.mymodule.MyReporterClass.
#output-format=
# Tells whether to display a full report or only the messages.
reports=no
# Activate the evaluation score.
score=yes
[SIMILARITIES]
# Comments are removed from the similarity computation
ignore-comments=yes
# Docstrings are removed from the similarity computation
ignore-docstrings=yes
# Imports are removed from the similarity computation
ignore-imports=yes
# Signatures are removed from the similarity computation
ignore-signatures=yes
# Minimum lines number of a similarity.
min-similarity-lines=4
[SPELLING]
# Limits count of emitted suggestions for spelling mistakes.
max-spelling-suggestions=4
# Spelling dictionary name. No available dictionaries : You need to install
# both the python package and the system dependency for enchant to work.
spelling-dict=
# List of comma separated words that should be considered directives if they
# appear at the beginning of a comment and should not be checked.
spelling-ignore-comment-directives=fmt: on,fmt: off,noqa:,noqa,nosec,isort:skip,mypy:
# List of comma separated words that should not be checked.
spelling-ignore-words=
# A path to a file that contains the private dictionary; one word per line.
spelling-private-dict-file=
# Tells whether to store unknown words to the private dictionary (see the
# --spelling-private-dict-file option) instead of raising a message.
spelling-store-unknown-words=no
[STRING]
# This flag controls whether inconsistent-quotes generates a warning when the
# character used as a quote delimiter is used inconsistently within a module.
check-quote-consistency=no
# This flag controls whether the implicit-str-concat should generate a warning
# on implicit string concatenation in sequences defined over several lines.
check-str-concat-over-line-jumps=no
[TYPECHECK]
# List of decorators that produce context managers, such as
# contextlib.contextmanager. Add to this list to register other decorators that
# produce valid context managers.
contextmanager-decorators=contextlib.contextmanager
# List of members which are set dynamically and missed by pylint inference
# system, and so shouldn't trigger E1101 when accessed. Python regular
# expressions are accepted.
generated-members=
# Tells whether to warn about missing members when the owner of the attribute
# is inferred to be None.
ignore-none=yes
# This flag controls whether pylint should warn about no-member and similar
# checks whenever an opaque object is returned when inferring. The inference
# can return multiple potential results while evaluating a Python object, but
# some branches might not be evaluated, which results in partial inference. In
# that case, it might be useful to still emit no-member and other checks for
# the rest of the inferred objects.
ignore-on-opaque-inference=yes
# List of symbolic message names to ignore for Mixin members.
ignored-checks-for-mixins=no-member,
not-async-context-manager,
not-context-manager,
attribute-defined-outside-init
# List of class names for which member attributes should not be checked (useful
# for classes with dynamically set attributes). This supports the use of
# qualified names.
ignored-classes=optparse.Values,thread._local,_thread._local,argparse.Namespace
# Show a hint with possible names when a member name was not found. The aspect
# of finding the hint is based on edit distance.
missing-member-hint=yes
# The minimum edit distance a name should have in order to be considered a
# similar match for a missing member name.
missing-member-hint-distance=1
# The total number of similar names that should be taken in consideration when
# showing a hint for a missing member.
missing-member-max-choices=1
# Regex pattern to define which classes are considered mixins.
mixin-class-rgx=.*[Mm]ixin
# List of decorators that change the signature of a decorated function.
signature-mutators=
[VARIABLES]
# List of additional names supposed to be defined in builtins. Remember that
# you should avoid defining new builtins when possible.
additional-builtins=
# Tells whether unused global variables should be treated as a violation.
allow-global-unused-variables=yes
# List of names allowed to shadow builtins
allowed-redefined-builtins=
# List of strings which can identify a callback function by name. A callback
# name must start or end with one of those strings.
callbacks=cb_,
_cb
# A regular expression matching the name of dummy variables (i.e. expected to
# not be used).
dummy-variables-rgx=_+$|(_[a-zA-Z0-9_]*[a-zA-Z0-9]+?$)|dummy|^ignored_|^unused_
# Argument names that match this expression will be ignored.
ignored-argument-names=_.*|^ignored_|^unused_
# Tells whether we should check for unused import in __init__ files.
init-import=no
# List of qualified module names which can have objects that can redefine
# builtins.
redefining-builtins-modules=six.moves,past.builtins,future.builtins,builtins,io
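The `evaluation` option under `[REPORTS]` is an ordinary Python expression. As a sanity check, here is how it scores a hypothetical run; the counts below are invented for illustration, not taken from a real lint run:

```python
# Evaluate the pylint score formula from the [REPORTS] section for a
# hypothetical run: 5 errors, 3 warnings, 200 statements (assumed values).
fatal, error, warning, refactor, convention = 0, 5, 3, 0, 0
statement = 200
score = max(0, 0 if fatal else 10.0 - (
    (float(5 * error + warning + refactor + convention) / statement) * 10))
print(round(score, 2))  # → 8.6
```

Errors are weighted five times as heavily as warnings, refactors, and conventions, and a single fatal message pins the score to 0.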

View File

@@ -1,11 +1,20 @@
-FROM python:3
+FROM python:3.11-slim
 WORKDIR /app
-COPY Pipfile Pipfile.lock /app/
+COPY Pipfile Pipfile.lock .
 RUN pip install pipenv
-RUN pipenv install --deploy --ignore-pipfile
+RUN pipenv install --deploy --system
-COPY . /src
+COPY src src
-CMD ["pipenv", "run", "python", "/app/src/main.py"]
+EXPOSE 8000
+# Set environment variables used by the deployment. These can be overridden by the user using this image.
+ENV NUM_WORKERS=1
+ENV OSM_CACHE_DIR=/cache
+ENV MEMCACHED_HOST_PATH=none
+ENV LOKI_URL=none
+# explicitly use a string instead of an argument list to force a shell and variable expansion
+CMD fastapi run src/main.py --port 8000 --workers $NUM_WORKERS

View File

@@ -1,14 +1,27 @@
-[[source]]
-url = "https://pypi.org/simple"
-verify_ssl = true
-name = "pypi"
-[packages]
-numpy = "*"
-scipy = "*"
-fastapi = "*"
-[dev-packages]
-[requires]
-python_version = "3.12"
+[[source]]
+url = "https://pypi.org/simple"
+verify_ssl = true
+name = "pypi"
+[dev-packages]
+pylint = "*"
+pytest = "*"
+tomli = "*"
+httpx = "*"
+exceptiongroup = "*"
+pytest-html = "*"
+typing-extensions = "*"
+dill = "*"
+[packages]
+numpy = "*"
+fastapi = "*"
+pydantic = "*"
+shapely = "*"
+pymemcache = "*"
+fastapi-cli = "*"
+scikit-learn = "*"
+loki-logger-handler = "*"
+pulp = "*"
+scipy = "*"
+requests = "*"

1782
backend/Pipfile.lock generated

File diff suppressed because it is too large Load Diff

56
backend/README.md Normal file
View File

@@ -0,0 +1,56 @@
# Backend
This repository contains the backend code for the application. It utilizes **FastAPI** to quickly create a RESTful API that exposes the endpoints of the route optimizer.
## Getting Started
### Directory Structure
- The code for the Python application is located in the `src` directory.
- Package management is handled with **pipenv**, and the dependencies are listed in the `Pipfile`.
- Since the application is designed to be deployed in a container, the `Dockerfile` is provided to build the image.
### Setting Up the Development Environment
To set up your development environment using **pipenv**, follow these steps:
1. Install `pipenv` by running:
```bash
sudo apt install pipenv
```
2. Create and activate a virtual environment:
```bash
pipenv shell
```
3. Install the dependencies listed in the `Pipfile`:
```bash
pipenv install
```
4. The virtual environment will be created under:
```bash
~/.local/share/virtualenvs/...
```
### Deployment
To deploy the backend Docker container, we use Kubernetes. Modifications to the backend are automatically pushed to a two-stage environment through the CI pipeline. See [deployment/README](deployment/README.md) for further information.
The deployment configuration is included as a submodule in the `deployment` directory. The standalone repository is under [https://git.kluster.moll.re/anydev/anyway-backend-deployment/](https://git.kluster.moll.re/anydev/anyway-backend-deployment/).
## Development
The backend application is structured around the `src` directory, which contains the core components for handling route optimization and API logic. Development generally involves working with key modules such as the optimization engine, Overpass API integration, and utilities for managing landmarks and trip data.
### Key Areas:
- **API Endpoints**: The main interaction with the backend is through the endpoints defined in `src/main.py`. FastAPI simplifies the creation of RESTful services that manage trip and landmark data.
- **Optimization Logic**: The trip optimization and refinement are handled in the `src/optimization` module. This is where the core algorithms are implemented.
- **Landmark Management**: Fetching and prioritizing points of interest (POIs) based on user preferences happens in `src/utils/LandmarkManager`.
- **Testing**: The `src/tests` directory includes tests covering various scenarios, ensuring that the logic works as expected.
For detailed information, refer to the [src README](src/README.md).
### Running the Application:
To run the backend locally, ensure that the virtual environment is activated and all dependencies are installed as outlined in the "Getting Started" section. You can start the FastAPI server with:
```bash
uvicorn src.main:app --reload
```

47
backend/conftest.py Normal file
View File

@@ -0,0 +1,47 @@
import pytest

pytest_plugins = ["pytest_html"]


def pytest_html_report_title(report):
    """Modify the title of the HTML report."""
    report.title = "Backend Testing Report"


def pytest_html_results_table_header(cells):
    cells.insert(2, "<th>Detailed trip</th>")
    cells.insert(3, "<th>Trip Duration</th>")
    cells.insert(4, "<th>Target Duration</th>")
    cells[5] = "<th>Execution time</th>"  # rename the column containing execution times to avoid confusion


def pytest_html_results_table_row(report, cells):
    trip_details = getattr(report, "trip_details", "N/A")  # Default to "N/A" if no trip data
    trip_duration = getattr(report, "trip_duration", "N/A")  # Default to "N/A" if no trip data
    target_duration = getattr(report, "target_duration", "N/A")  # Default to "N/A" if no trip data
    cells.insert(2, f"<td>{trip_details}</td>")
    cells.insert(3, f"<td>{trip_duration}</td>")
    cells.insert(4, f"<td>{target_duration}</td>")


@pytest.hookimpl(hookwrapper=True)
def pytest_runtest_makereport(item, call):
    outcome = yield
    report = outcome.get_result()
    report.description = str(item.function.__doc__)
    # Attach trip_details if it exists
    if hasattr(item, "trip_details"):
        report.trip_details = " - ".join(item.trip_details)  # Convert list to string
    else:
        report.trip_details = "N/A"  # Default if trip_details is not set
    # Attach trip_duration if it exists
    if hasattr(item, "trip_duration"):
        report.trip_duration = item.trip_duration + " min"
    else:
        report.trip_duration = "N/A"  # Default if duration is not set
    # Attach target_duration if it exists
    if hasattr(item, "target_duration"):
        report.target_duration = item.target_duration + " min"
    else:
        report.target_duration = "N/A"  # Default if duration is not set
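The hooks above lean on `getattr` with a default so reports that carry no trip data degrade to "N/A". The pattern in isolation, with a stand-in class since the real pytest report object is not needed here:

```python
# The getattr(obj, name, default) pattern conftest.py relies on:
# a missing attribute yields the fallback instead of raising.
class Report:
    """Stand-in for the pytest report object (illustrative only)."""

report = Report()
assert getattr(report, "trip_details", "N/A") == "N/A"  # not attached yet
report.trip_details = " - ".join(["Louvre", "Eiffel Tower"])  # list -> string, as in the hook
assert getattr(report, "trip_details", "N/A") == "Louvre - Eiffel Tower"
```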

1
backend/deployment Submodule

Submodule backend/deployment added at 904f16bfc0

1094
backend/report.html Normal file

File diff suppressed because one or more lines are too long

65
backend/src/README.md Normal file
View File

@@ -0,0 +1,65 @@
# Overview of backend/src
This project is structured into several components that handle different aspects of the application's functionality. Below is a high-level overview of each folder and the key Python files in the `src` directory.
## Folders
### src/optimization
This folder contains modules related to the optimization algorithm used to compute the optimal trip. It comprises the optimizer for the first rough trip and a refiner to include less famous landmarks as well.
### src/overpass
This folder handles interactions with the Overpass API, including constructing and sending queries, caching responses, and parsing results from the Overpass database.
### src/parameters
The modules in this folder define and manage parameters for various parts of the application. This includes configuration values for the optimizer or the list of selectors for Overpass queries.
### src/structs
This folder defines the commonly used data structures used within the project. The models leverage Pydantic's `BaseModel` to ensure data validation, serialization, and easy interaction between different components of the application. The main classes are:
- **Landmark**:
- Represents a point of interest in the context of a trip. It stores various attributes like the landmark's name, type, location (latitude and longitude), and its OSM details.
- It also includes other optional fields like image URLs, website links, and descriptions. Additionally, the class has properties to track its attractiveness score and relative importance.
- **Preferences**:
- This class captures user-defined preferences needed to personalize a trip. Preferences are provided for sightseeing (history and culture), nature (parks and gardens), and shopping. These preferences guide the trip optimization process.
- **Trip**:
- The `Trip` class represents the complete travel plan generated by the system. It holds key information like the trip's total time and the first landmark's UUID.
### src/tests
This folder contains unit tests and test cases for the application's various modules. It is used to ensure the correctness and stability of the code.
### src/utils
The `utils` folder contains utility classes and functions that provide core functionality for the application. The main component in this folder is the `LandmarkManager`, which is central to the process of fetching and organizing landmarks.
- **LandmarkManager**:
- The `LandmarkManager` is responsible for fetching landmarks from OpenStreetMap (via the Overpass API) and managing their classification based on user preferences. It processes raw geographical data, filters landmarks into relevant categories (such as sightseeing, nature, shopping), and prioritizes them for trip planning.
## Files
### src/cache.py
This file manages the caching mechanisms used throughout the application. It defines the caching strategy for storing and retrieving data, improving the performance of repeated operations by avoiding redundant API calls or computations.
### src/constants.py
This module defines global constants used throughout the project. These constants may include API endpoints, fixed configuration values, or reusable strings and integers that need to remain consistent.
### src/logging_config.py
This file configures the logging system for the application. It defines how logs are formatted, where they are output (e.g., console or file), and the logging levels (e.g., debug, info, error).
### src/main.py
This file contains the main application logic and API endpoints for interacting with the system. The application is built using the FastAPI framework, which provides several endpoints for creating trips, fetching trips, and retrieving landmarks or nearby facilities. The key endpoints include:
- **POST /trip/new**:
- This endpoint allows users to create a new trip by specifying preferences, start coordinates, and optionally end coordinates. The preferences guide the optimization process for selecting landmarks.
- Returns: A `Trip` object containing the optimized route, landmarks, and trip details.
- **GET /trip/{trip_uuid}**:
- This endpoint fetches an already generated trip by its unique identifier (`trip_uuid`). It retrieves the trip data from the cache.
- Returns: A `Trip` object corresponding to the given `trip_uuid`.
- **GET /landmark/{landmark_uuid}**:
- This endpoint retrieves a specific landmark by its unique identifier (`landmark_uuid`) from the cache.
- Returns: A `Landmark` object containing the details of the requested landmark.
- **POST /toilets/new**:
- This endpoint searches for public toilets near a specified location within a given radius. The location and radius are passed as query parameters.
- Returns: A list of `Toilets` objects located within the specified radius of the provided coordinates.
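As an illustration of the last endpoint, a client could assemble the query string as below; the host and the exact parameter names (`location`, `radius`) are assumptions based on the description above, not a verified API schema:

```python
from urllib.parse import urlencode

# Hypothetical client-side sketch of calling POST /toilets/new.
# Location and radius are passed as query parameters per the README.
base = "http://localhost:8000/toilets/new"
query = urlencode({"location": "48.8584,2.2945", "radius": 500})
url = f"{base}?{query}"
print(url)  # → http://localhost:8000/toilets/new?location=48.8584%2C2.2945&radius=500
```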

75
backend/src/cache.py Normal file
View File

@@ -0,0 +1,75 @@
"""Module used for handling cache"""
from pymemcache import serde
from pymemcache.client.base import Client

from .constants import MEMCACHED_HOST_PATH


class DummyClient:
    """
    A dummy in-memory client that mimics the behavior of a memcached client.

    This class is designed to simulate the behavior of the `pymemcache.Client`
    for testing or development purposes. It stores data in a Python dictionary
    and provides methods to set, get, and update key-value pairs.

    Attributes:
        _data (dict): A dictionary that holds the key-value pairs.

    Methods:
        set(key, value, **kwargs):
            Stores the given key-value pair in the internal dictionary.
        set_many(data, **kwargs):
            Updates the internal dictionary with multiple key-value pairs.
        get(key, **kwargs):
            Retrieves the value associated with the given key from the internal
            dictionary.
    """
    _data = {}

    def set(self, key, value, **kwargs):  # pylint: disable=unused-argument
        """
        Store a key-value pair in the internal dictionary.

        Args:
            key: The key for the item to be stored.
            value: The value to be stored under the given key.
            **kwargs: Additional keyword arguments (unused).
        """
        self._data[key] = value

    def set_many(self, data, **kwargs):  # pylint: disable=unused-argument
        """
        Update the internal dictionary with multiple key-value pairs.

        Args:
            data: A dictionary containing key-value pairs to be added.
            **kwargs: Additional keyword arguments (unused).
        """
        self._data.update(data)

    def get(self, key, **kwargs):  # pylint: disable=unused-argument
        """
        Retrieve the value associated with the given key.

        Args:
            key: The key for the item to be retrieved.
            **kwargs: Additional keyword arguments (unused).

        Returns:
            The value associated with the given key if it exists.
        """
        return self._data[key]


if MEMCACHED_HOST_PATH is None:
    client = DummyClient()
else:
    client = Client(
        MEMCACHED_HOST_PATH,
        timeout=1,
        allow_unicode_keys=True,
        encoding='utf-8',
        serde=serde.pickle_serde
    )
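The `serde.pickle_serde` argument is what lets arbitrary Python objects (such as cached `Trip` or `Landmark` instances) survive the round trip to memcached. A stdlib-only sketch of that mechanism, with a plain dict standing in for the server:

```python
import pickle

# Stand-in for the cache: pickle on set, unpickle on get, which is
# essentially what serde.pickle_serde does for the real client.
store = {}

def cache_set(key, value):
    store[key] = pickle.dumps(value)

def cache_get(key):
    return pickle.loads(store[key])

cache_set("trip_abc", {"uuid": "abc", "total_time": 120})
assert cache_get("trip_abc") == {"uuid": "abc", "total_time": 120}
```

Note that `DummyClient.get` raises `KeyError` on a miss, which is exactly what the endpoint handlers in `main.py` catch to return a 404.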

20
backend/src/constants.py Normal file
View File

@@ -0,0 +1,20 @@
"""Module setting global parameters for the application such as cache, route generation, etc."""
import os
from pathlib import Path
LOCATION_PREFIX = Path('src')
PARAMETERS_DIR = LOCATION_PREFIX / 'parameters'
AMENITY_SELECTORS_PATH = PARAMETERS_DIR / 'amenity_selectors.yaml'
LANDMARK_PARAMETERS_PATH = PARAMETERS_DIR / 'landmark_parameters.yaml'
OPTIMIZER_PARAMETERS_PATH = PARAMETERS_DIR / 'optimizer_parameters.yaml'
cache_dir_string = os.getenv('OSM_CACHE_DIR', './cache')
OSM_CACHE_DIR = Path(cache_dir_string)
MEMCACHED_HOST_PATH = os.getenv('MEMCACHED_HOST_PATH', None)
if MEMCACHED_HOST_PATH == "none":
    MEMCACHED_HOST_PATH = None
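The `"none"` sentinel exists because the Dockerfile defaults `MEMCACHED_HOST_PATH` to the literal string `none`; a minimal check of that normalization:

```python
import os

# The Dockerfile sets MEMCACHED_HOST_PATH=none as a plain string;
# constants.py maps that sentinel back to Python's None, which in
# turn makes cache.py fall back to the in-memory DummyClient.
os.environ["MEMCACHED_HOST_PATH"] = "none"
host = os.getenv("MEMCACHED_HOST_PATH", None)
if host == "none":
    host = None
assert host is None
```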

View File

@@ -0,0 +1,56 @@
"""Sets up global logging configuration for the application."""
import logging
import os

logger = logging.getLogger(__name__)


def configure_logging():
    """
    Called at startup of a FastAPI application instance to set up logging.
    Depending on the environment, it will log to stdout or to Loki.
    """
    is_debug = os.getenv('DEBUG', "false") == "true"
    is_kubernetes = os.getenv('KUBERNETES_SERVICE_HOST') is not None

    if is_kubernetes:
        # in that case we want to log to stdout and also to loki
        from loki_logger_handler.loki_logger_handler import LokiLoggerHandler
        loki_url = os.getenv('LOKI_URL')
        if loki_url is None:
            raise ValueError("LOKI_URL environment variable is not set")

        loki_handler = LokiLoggerHandler(
            url = loki_url,
            labels = {'app': 'anyway', 'environment': 'staging' if is_debug else 'production'}
        )
        logger.info(f"Logging to Loki at {loki_url} with {loki_handler.labels} and {is_debug=}")
        logging_handlers = [loki_handler, logging.StreamHandler()]
        logging_level = logging.DEBUG if is_debug else logging.INFO
        # silence the chatty logs loki generates itself
        logging.getLogger('urllib3.connectionpool').setLevel(logging.WARNING)
        # no need for time since it's added by loki or can be shown in kube logs
        logging_format = '%(name)s - %(levelname)s - %(message)s'
    else:
        # if we are in a debug (local) session, set verbose and rich logging
        from rich.logging import RichHandler
        logging_handlers = [RichHandler()]
        logging_level = logging.DEBUG if is_debug else logging.INFO
        logging_format = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'

    logging.basicConfig(
        level = logging_level,
        format = logging_format,
        handlers = logging_handlers
    )

    # also overwrite the uvicorn loggers
    logging.getLogger('uvicorn').handlers = logging_handlers
    logging.getLogger('uvicorn.access').handlers = logging_handlers
    logging.getLogger('uvicorn.error').handlers = logging_handlers
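Stripped of the Loki and rich handlers, the branching above reduces to picking a level from `DEBUG` and wiring a stream handler. A minimal, stdlib-only sketch (with `force=True` added so repeated calls reconfigure, which the original does not need at startup):

```python
import logging
import os

# Minimal sketch of configure_logging(): level from the DEBUG env var,
# plain stream handler, no Loki/rich. force=True lets basicConfig
# reconfigure even if logging was already set up.
is_debug = os.getenv("DEBUG", "false") == "true"
level = logging.DEBUG if is_debug else logging.INFO
logging.basicConfig(
    level=level,
    format="%(name)s - %(levelname)s - %(message)s",
    handlers=[logging.StreamHandler()],
    force=True,
)
logger = logging.getLogger("demo")
assert logger.getEffectiveLevel() == level  # inherits the root level
```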

View File

@@ -1,23 +1,195 @@
import fastapi
from dataclasses import dataclass


@dataclass
class Destination:
    name: str
    location: tuple
    attractiveness: int


d = Destination()


def get_route() -> list[Destination]:
    return {"route": "Hello World"}

endpoint = ("/get_route", get_route)
end

if __name__ == "__main__":
    fastapi.run()
"""Main app for backend api"""
import logging
import time
from contextlib import asynccontextmanager
from fastapi import FastAPI, HTTPException, Query
from .logging_config import configure_logging
from .structs.landmark import Landmark, Toilets
from .structs.preferences import Preferences
from .structs.linked_landmarks import LinkedLandmarks
from .structs.trip import Trip
from .utils.landmarks_manager import LandmarkManager
from .utils.toilets_manager import ToiletsManager
from .optimization.optimizer import Optimizer
from .optimization.refiner import Refiner
from .cache import client as cache_client
logger = logging.getLogger(__name__)
manager = LandmarkManager()
optimizer = Optimizer()
refiner = Refiner(optimizer=optimizer)
@asynccontextmanager
async def lifespan(app: FastAPI):
"""Function to run at the start of the app"""
logger.info("Setting up logging")
configure_logging()
yield
logger.info("Shutting down logging")
app = FastAPI(lifespan=lifespan)
@app.post("/trip/new")
def new_trip(preferences: Preferences,
start: tuple[float, float],
end: tuple[float, float] | None = None) -> Trip:
"""
Main function to call the optimizer.
Args:
preferences : the preferences specified by the user as the post body
start : the coordinates of the starting point
end : the coordinates of the finishing point
Returns:
(uuid) : The uuid of the first landmark in the optimized route
"""
if preferences is None:
raise HTTPException(status_code=406, detail="Preferences not provided or incomplete.")
if (preferences.shopping.score == 0 and
preferences.sightseeing.score == 0 and
preferences.nature.score == 0) :
raise HTTPException(status_code=406, detail="All preferences are 0.")
if start is None:
raise HTTPException(status_code=406, detail="Start coordinates not provided")
if not (-90 <= start[0] <= 90 or -180 <= start[1] <= 180):
raise HTTPException(status_code=422, detail="Start coordinates not in range")
if end is None:
end = start
logger.info("No end coordinates provided. Using start=end.")
start_landmark = Landmark(name='start',
type='start',
location=(start[0], start[1]),
osm_type='start',
osm_id=0,
attractiveness=0,
duration=0,
must_do=True,
n_tags = 0)
end_landmark = Landmark(name='finish',
type='finish',
location=(end[0], end[1]),
osm_type='end',
osm_id=0,
attractiveness=0,
duration=0,
must_do=True,
n_tags=0)
start_time = time.time()
# Generate the landmarks from the start location
landmarks, landmarks_short = manager.generate_landmarks_list(
center_coordinates = start,
preferences = preferences
)
# insert start and finish to the landmarks list
landmarks_short.insert(0, start_landmark)
landmarks_short.append(end_landmark)
t_generate_landmarks = time.time() - start_time
logger.info(f'Fetched {len(landmarks)} landmarks in \t: {round(t_generate_landmarks,3)} seconds')
start_time = time.time()
# First stage optimization
try:
base_tour = optimizer.solve_optimization(preferences.max_time_minute, landmarks_short)
except Exception as exc:
raise HTTPException(status_code=500, detail=f"Optimization failed: {str(exc)}") from exc
t_first_stage = time.time() - start_time
start_time = time.time()
# Second stage optimization
# TODO : only if necessary (not enough landmarks for ex.)
try :
refined_tour = refiner.refine_optimization(landmarks, base_tour,
preferences.max_time_minute,
preferences.detour_tolerance_minute)
except Exception as exc :
raise HTTPException(status_code=500, detail=f"An unexpected error occurred: {str(exc)}") from exc
t_second_stage = time.time() - start_time
logger.debug(f'First stage optimization\t: {round(t_first_stage,3)} seconds')
logger.debug(f'Second stage optimization\t: {round(t_second_stage,3)} seconds')
logger.info(f'Total computation time\t: {round(t_first_stage + t_second_stage,3)} seconds')
linked_tour = LinkedLandmarks(refined_tour)
# upon creation of the trip, persistence of both the trip and its landmarks is ensured.
trip = Trip.from_linked_landmarks(linked_tour, cache_client)
logger.info(f'Generated a trip of {trip.total_time} minutes with {len(refined_tour)} landmarks in {round(t_generate_landmarks + t_first_stage + t_second_stage,3)} seconds.')
return trip
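The range guards on `start` deserve a closer look: both the latitude and the longitude must be valid, so the check wants `and`, not `or` (with `or`, a point is only rejected when both components are out of range). A minimal standalone sketch of the intended check; the helper name is illustrative, not part of this API:

```python
# Hypothetical helper mirroring the endpoint's intended validation:
# a (lat, lon) pair is valid only if BOTH components are in range.
def coords_in_range(coords: tuple[float, float]) -> bool:
    lat, lon = coords
    return -90 <= lat <= 90 and -180 <= lon <= 180
```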
#### For already existing trips/landmarks
@app.get("/trip/{trip_uuid}")
def get_trip(trip_uuid: str) -> Trip:
"""
Look-up the cache for a trip that has been previously generated using its identifier.
Args:
trip_uuid (str) : unique identifier for a trip.
Returns:
(Trip) : the corresponding trip.
"""
try:
trip = cache_client.get(f"trip_{trip_uuid}")
return trip
except KeyError as exc:
raise HTTPException(status_code=404, detail="Trip not found") from exc
@app.get("/landmark/{landmark_uuid}")
def get_landmark(landmark_uuid: str) -> Landmark:
"""
Returns a Landmark from its unique identifier.
Args:
landmark_uuid (str) : unique identifier for a Landmark.
Returns:
(Landmark) : the corresponding Landmark.
"""
try:
landmark = cache_client.get(f"landmark_{landmark_uuid}")
return landmark
except KeyError as exc:
raise HTTPException(status_code=404, detail="Landmark not found") from exc
@app.post("/toilets/new")
def get_toilets(location: tuple[float, float] = Query(...), radius: int = 500) -> list[Toilets] :
"""
Endpoint to find toilets within a specified radius from a given location.
This endpoint expects the `location` and `radius` as **query parameters**, not in the request body.
Args:
location (tuple[float, float]): The latitude and longitude of the location to search from.
radius (int, optional): The radius (in meters) within which to search for toilets. Defaults to 500 meters.
Returns:
list[Toilets]: A list of Toilets objects that meet the criteria.
"""
if location is None:
raise HTTPException(status_code=406, detail="Coordinates not provided or invalid")
if not (-90 <= location[0] <= 90 and -180 <= location[1] <= 180):
raise HTTPException(status_code=422, detail="Location coordinates not in range")
toilets_manager = ToiletsManager(location, radius)
try :
toilets_list = toilets_manager.generate_toilet_list()
return toilets_list
except KeyError as exc:
raise HTTPException(status_code=404, detail="No toilets found") from exc

"""Module responsible for solving an MILP to find the best tour around the given landmarks."""
import logging
from collections import defaultdict, deque
import yaml
import numpy as np
import pulp as pl
from ..structs.landmark import Landmark
from ..utils.get_time_distance import get_time
from ..constants import OPTIMIZER_PARAMETERS_PATH
# Silence the pulp logger
logging.getLogger('pulp').setLevel(level=logging.CRITICAL)
class Optimizer:
"""
Optimizes the balance between the efficiency of a tour and the inclusion of landmarks.
The `Optimizer` class is responsible for calculating the best possible detour adjustments
to a tour based on specific parameters such as detour time, walking speed, and the maximum
number of landmarks to visit. It helps refine a tour by determining whether adding additional
landmarks would significantly reduce the overall efficiency.
Responsibilities:
- Calculates the maximum detour time allowed for a given tour.
- Considers the detour factor, which accounts for real-world walking paths versus straight-line distance.
- Takes into account the average walking speed to estimate walking times.
- Limits the number of landmarks that can be added to the tour to prevent excessive detouring.
- Allows some overflow (overshoot) in the maximum detour time to accommodate for slight inefficiencies.
Attributes:
logger (logging.Logger): Logger for capturing relevant events and errors.
detour (int): The accepted maximum detour time in minutes.
detour_factor (float): The ratio between straight-line distance and actual walking distance in cities.
average_walking_speed (float): The average walking speed of an adult (in meters per second or kilometers per hour).
max_landmarks (int): The maximum number of landmarks to include in the tour.
overshoot (float): The overshoot allowance for exceeding the maximum detour time in a restrictive manner.
"""
logger = logging.getLogger(__name__)
detour: int = None # accepted max detour time (in minutes)
detour_factor: float # detour factor of straight line vs real distance in cities
average_walking_speed: float # average walking speed of adult
max_landmarks: int # max number of landmarks to visit
overshoot: float # overshoot to allow maxtime to overflow. Optimizer is a bit restrictive
def __init__(self) :
# load parameters from file
with OPTIMIZER_PARAMETERS_PATH.open('r') as f:
parameters = yaml.safe_load(f)
self.detour_factor = parameters['detour_factor']
self.average_walking_speed = parameters['average_walking_speed']
self.max_landmarks = parameters['max_landmarks']
self.overshoot = parameters['overshoot']
def init_ub_time(self, prob: pl.LpProblem, x: pl.LpVariable, L: int, landmarks: list[Landmark], max_time: int):
"""
Initialize the objective function and inequality constraints for the linear program.
This function sets up the objective to maximize the attractiveness of visiting landmarks,
while ensuring that the total time (including travel and visit duration) does not exceed
the maximum allowed time. It calculates the pairwise travel times between landmarks and
incorporates visit duration to form the inequality constraints.
The objective is to maximize sightseeing by selecting the most attractive landmarks within
the time limit.
Args:
prob (pl.LpProblem): The linear programming problem where constraints and the objective will be added.
x (pl.LpVariable): A decision variable representing whether a landmark is visited.
L (int): The number of landmarks.
landmarks (list[Landmark]): List of landmarks to visit.
max_time (int): Maximum allowable time for sightseeing, including travel and visit duration.
Returns:
None: Adds the objective function and constraints to the LP problem directly.
constraint coefficients, and the right-hand side of the inequality constraint.
"""
L = len(landmarks)
# Objective function coefficients. a*x1 + b*x2 + c*x3 + ...
c = np.zeros(L, dtype=np.int16)
# inequality matrix and vector
A_ub = np.zeros(L*L, dtype=np.int16)
b_ub = round(max_time*(1.1+max_time*self.overshoot))
for i, spot1 in enumerate(landmarks) :
c[i] = spot1.attractiveness
for j in range(i+1, L) :
if i !=j :
t = get_time(spot1.location, landmarks[j].location)
A_ub[i*L + j] = t + spot1.duration
A_ub[j*L + i] = t + landmarks[j].duration
# Expand 'c' to length L*L so every decision variable has an objective coefficient
c = np.tile(c, L)
# Now sort and modify A_ub for each row
if L > 22 :
for i in range(L):
# Get indices of the 22 closest landmarks in row i
row_values = A_ub[i*L:i*L+L]
closest_indices = np.argpartition(row_values, 22)[:22]
# Create a mask for non-closest landmarks
mask = np.ones(L, dtype=bool)
mask[closest_indices] = False
# Set non-closest landmarks to 32765
row_values[mask] = 32765
A_ub[i*L:i*L+L] = row_values
# Add the objective and the 1 distance constraint
prob += pl.lpSum([c[j] * x[j] for j in range(L*L)])
prob += (pl.lpSum([A_ub[j] * x[j] for j in range(L*L)]) <= b_ub)
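The flattened indexing used throughout this class is easy to trip over: the transition "from landmark i to landmark j" lives at position `i*L + j` of a length-`L*L` vector, and the two directions of the same pair can carry different costs because each includes the departure landmark's visit duration. A toy illustration with made-up travel times, together with the overshoot-adjusted time budget from above (the `overshoot` value here is an example, not the configured one):

```python
import numpy as np

# Transition "i -> j" lives at index i*L + j of the flattened vector.
L = 3
A_ub = np.zeros(L * L, dtype=np.int16)
A_ub[0 * L + 2] = 7   # travel 0 -> 2: 7 minutes (made-up value)
A_ub[2 * L + 0] = 9   # reverse direction may differ (visit durations)

# Time budget: max_time stretched by 10% plus an overshoot term.
max_time, overshoot = 120, 0.001
b_ub = round(max_time * (1.1 + max_time * overshoot))
```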
def respect_number(self, prob: pl.LpProblem, x: pl.LpVariable, L: int, max_landmarks: int):
"""
Generate constraints to ensure each landmark is visited at most once and cap the total number of visited landmarks.
This function adds the following constraints to the linear program:
1. Each landmark is visited at most once by creating L-2 constraints (one for each landmark).
2. The total number of visited landmarks is capped by the specified maximum number (`max_landmarks`) plus 2.
Args:
prob (pl.LpProblem): The linear programming problem where constraints will be added.
x (pl.LpVariable): Decision variable indicating whether a landmark is visited.
L (int): The total number of landmarks.
max_landmarks (int): The maximum number of landmarks that can be visited.
Returns:
None: This function directly modifies the `prob` object by adding constraints.
"""
# L-2 constraints: each landmark is visited at most once
for i in range(1, L-1):
prob += (pl.lpSum([x[L*i + j] for j in range(L)]) <= 1)
# 1 constraint: cap the total number of visits
prob += (pl.lpSum([1 * x[j] for j in range(L*L)]) <= max_landmarks+2)
def break_sym(self, prob: pl.LpProblem, x: pl.LpVariable, L: int):
"""
Generate constraints to prevent simultaneous travel between two landmarks
in both directions. This constraint ensures that, for any pair of landmarks,
travel from landmark i to landmark j (dij) and travel from landmark j to landmark i (dji)
cannot happen simultaneously.
This method adds constraints to break symmetry, specifically to prevent
cyclic paths with only two elements. It does not prevent cyclic paths involving more than two elements.
Args:
prob (pl.LpProblem): The linear programming problem where constraints will be added.
x (pl.LpVariable): Decision variable representing travel between landmarks.
L (int): The total number of landmarks.
Returns:
None: This function modifies the `prob` object by adding constraints in-place.
"""
upper_ind = np.triu_indices(L, 0, L) # Get the upper triangular indices
up_ind_x = upper_ind[0]
up_ind_y = upper_ind[1]
# Loop over the upper triangular indices, excluding diagonal elements
for i, up_ind in enumerate(up_ind_x):
if up_ind != up_ind_y[i]:
# Add (L*L-L)/2 constraints to break symmetry
prob += (x[up_ind*L + up_ind_y[i]] + x[up_ind_y[i]*L + up_ind] <= 1)
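As the comment above notes, this adds one constraint per unordered pair of distinct landmarks, i.e. `(L*L - L) / 2` constraints. A small check of that count using the same `np.triu_indices` call:

```python
import numpy as np

# One symmetry-breaking constraint per unordered pair {i, j}, i != j.
L = 5
up_x, up_y = np.triu_indices(L, 0, L)       # includes the diagonal
off_diagonal_pairs = [(i, j) for i, j in zip(up_x, up_y) if i != j]
```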
def init_eq_not_stay(self, prob: pl.LpProblem, x: pl.LpVariable, L: int):
"""
Generate constraints to prevent staying at the same position during travel.
Specifically, it removes travel from a landmark to itself (e.g., d11, d22, d33, etc.).
This function adds one equality constraint to the optimization problem that ensures
no decision variable corresponding to staying at the same landmark is included
in the solution. This helps in ensuring that the path does not include self-loops.
Args:
prob (pl.LpProblem): The linear programming problem where constraints will be added.
x (pl.LpVariable): Decision variable representing travel between landmarks.
L (int): The total number of landmarks.
Returns:
None: This function modifies the `prob` object by adding an equality constraint in-place.
"""
A_eq = np.zeros((L, L), dtype=np.int8)
# Set diagonal elements to 1 (to prevent staying in the same position)
np.fill_diagonal(A_eq, 1)
A_eq = A_eq.flatten()
# First equality constraint
prob += (pl.lpSum([A_eq[j] * x[j] for j in range(L*L)]) == 0)
def respect_start_finish(self, prob: pl.LpProblem, x: pl.LpVariable, L: int):
"""
Generate constraints to ensure that the optimization starts at the designated
start landmark and finishes at the goal landmark.
Specifically, this function adds three equality constraints:
1. Ensures that the path starts at the designated start landmark (row 0).
2. Ensures that the path finishes at the designated goal landmark (row 1).
3. Prevents any arrivals at the start landmark or departures from the goal landmark (row 2).
Args:
prob (pl.LpProblem): The linear programming problem where constraints will be added.
x (pl.LpVariable): Decision variable representing travel between landmarks.
L (int): The total number of landmarks.
Returns:
None: This function modifies the `prob` object by adding three equality constraints in-place.
"""
# Fill-in row 0.
A_eq = np.zeros((3,L*L), dtype=np.int8)
A_eq[0, :L] = np.ones(L, dtype=np.int8) # sets departures only for start (horizontal ones)
for k in range(L-1) :
if k != 0 :
# Fill-in row 1
A_eq[1, k*L+L-1] = 1 # sets arrivals only for finish (vertical ones)
# Fill-in row 2
A_eq[2, k*L] = 1
A_eq[2, L*(L-1):] = np.ones(L, dtype=np.int8) # prevents arrivals at start and departures from goal
b_eq= [1, 1, 0]
# Add the constraints to pulp
for i in range(3) :
prob += (pl.lpSum([A_eq[i][j] * x[j] for j in range(L*L)]) == b_eq[i])
def respect_order(self, prob: pl.LpProblem, x: pl.LpVariable, L: int):
"""
Generate constraints to tie the optimization problem together and prevent
stacked ones, although this does not fully prevent circles.
This function adds constraints to the optimization problem that prevent
simultaneous travel between landmarks in a way that would result in stacked ones.
However, it does not fully prevent circular paths.
Args:
prob (pl.LpProblem): The linear programming problem where constraints will be added.
x (pl.LpVariable): Decision variable representing travel between landmarks.
L (int): The total number of landmarks.
Returns:
None: This function modifies the `prob` object by adding L-2 equality constraints in-place.
"""
# FIXME: weird 0 artifact in the coefficients popping up
# Loop through rows 1 to L-2 to prevent stacked ones
for i in range(1, L-1):
# Add the constraint that sums across each "row" or "block" in the decision variables
row_sum = -pl.lpSum(x[i + j*L] for j in range(L)) + pl.lpSum(x[i*L:(i+1)*L])
prob += (row_sum == 0)
def respect_user_must(self, prob: pl.LpProblem, x: pl.LpVariable, L: int, landmarks: list[Landmark]) :
"""
Generate constraints to ensure that landmarks marked as 'must_do' are included in the optimization.
This function adds constraints to the optimization problem to ensure that landmarks marked as
'must_do' are included in the solution. It precomputes the constraints and adds them to the
problem accordingly.
Args:
prob (pl.LpProblem): The linear programming problem where constraints will be added.
x (pl.LpVariable): Decision variable representing travel between landmarks.
L (int): The total number of landmarks.
landmarks (list[Landmark]): List of landmarks, where some are marked as 'must_do'.
Returns:
None: This function modifies the `prob` object by adding equality constraints in-place.
"""
ones = np.ones(L, dtype=np.int8)
A_eq = np.zeros(L*L, dtype=np.int8)
for i, elem in enumerate(landmarks) :
if elem.must_do is True and i not in [0, L-1]:
A_eq[i*L:i*L+L] = ones
prob += (pl.lpSum([A_eq[j] * x[j] for j in range(L*L)]) == 1)
if elem.must_avoid is True and i not in [0, L-1]:
A_eq[i*L:i*L+L] = ones
prob += (pl.lpSum([A_eq[j] * x[j] for j in range(L*L)]) == 2)
def prevent_circle(self, prob: pl.LpProblem, x: pl.LpVariable, circle_vertices: list, L: int) :
"""
Prevent circular paths by adding constraints to the optimization.
This function ensures that circular paths in both directions (i.e., forward and reverse)
between landmarks are avoided in the optimization problem by adding the corresponding constraints.
Args:
prob (pl.LpProblem): The linear programming problem instance to which the constraints will be added.
x (pl.LpVariable): Decision variable representing the travel between landmarks in the problem.
circle_vertices (list): List of indices representing the landmarks that form a circular path.
L (int): The total number of landmarks.
Returns:
None: This function modifies the `prob` object by adding two equality constraints that
prevent circular paths in both directions for the specified circle vertices.
"""
l = np.zeros((2, L*L), dtype=np.int8)
for i, node in enumerate(circle_vertices[:-1]) :
next = circle_vertices[i+1]
l[0, node*L + next] = 1
l[1, next*L + node] = 1
s = circle_vertices[0]
g = circle_vertices[-1]
l[0, g*L + s] = 1
l[1, s*L + g] = 1
# Add the constraints
prob += (pl.lpSum([l[0][j] * x[j] for j in range(L*L)]) == 0)
prob += (pl.lpSum([l[1][j] * x[j] for j in range(L*L)]) == 0)
def is_connected(self, resx) :
"""
Determine the order of visits and detect any circular paths in the given configuration.
Args:
resx (list): List of edge weights.
Returns:
tuple[list[int], Optional[list[list[int]]]]: A tuple containing the visit order and a list of any detected circles.
"""
resx = np.round(resx).astype(np.int8) # round all elements and cast them to int
N = len(resx) # length of res
L = int(np.sqrt(N)) # number of landmarks. CAST INTO INT but should not be a problem because N = L**2 by def.
nonzeroind = np.nonzero(resx)[0] # np.nonzero returns a tuple; keep the index array
nonzero_tup = np.unravel_index(nonzeroind, (L,L))
ind_a = nonzero_tup[0]
ind_b = nonzero_tup[1]
# Extract all journeys
all_journeys_nodes = []
visited_nodes = set()
for node in ind_a:
if node not in visited_nodes:
journey_nodes = self.get_journey(node, ind_a, ind_b)
all_journeys_nodes.append(journey_nodes)
visited_nodes.update(journey_nodes)
for l in all_journeys_nodes :
if 0 in l :
all_journeys_nodes.remove(l)
break
if not all_journeys_nodes :
return None
return all_journeys_nodes
def get_journey(self, start, ind_a, ind_b):
"""
Trace the journey starting from a given node and follow the connections between landmarks.
This method constructs a graph from two lists of landmark connections, `ind_a` and `ind_b`,
where each element in `ind_a` is connected to the corresponding element in `ind_b`.
It then performs a depth-first search (DFS) starting from the `start` node to determine
the path (journey) by following the connections.
Args:
start (int): The starting node of the journey.
ind_a (list[int]): List of "from" nodes, representing the starting points of each connection.
ind_b (list[int]): List of "to" nodes, representing the endpoints of each connection.
Returns:
list[int]: A list of nodes representing the order of the journey, starting from the `start` node.
Example:
If `ind_a = [0, 1, 2]` and `ind_b = [1, 2, 3]`, starting from node 0, the journey would be `[0, 1, 2, 3]`.
"""
graph = defaultdict(list)
for a, b in zip(ind_a, ind_b):
graph[a].append(b)
journey_nodes = []
visited = set()
stack = deque([start])
while stack:
node = stack.pop()
if node not in visited:
visited.add(node)
journey_nodes.append(node)
for neighbor in graph[node]:
if neighbor not in visited:
stack.append(neighbor)
return journey_nodes
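The docstring example above can be reproduced with the same `defaultdict`/`deque` depth-first walk, stripped of the class wrapper:

```python
from collections import defaultdict, deque

# Docstring example: edges 0->1, 1->2, 2->3, traced from node 0.
ind_a, ind_b = [0, 1, 2], [1, 2, 3]
graph = defaultdict(list)
for a, b in zip(ind_a, ind_b):
    graph[a].append(b)

journey, visited, stack = [], set(), deque([0])
while stack:
    node = stack.pop()
    if node not in visited:
        visited.add(node)
        journey.append(node)
        for neighbor in graph[node]:
            if neighbor not in visited:
                stack.append(neighbor)
```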
def get_order(self, resx):
"""
Determine the order of visits given the result of the optimization.
Args:
resx (list): List of edge weights.
Returns:
list[int]: A list containing the visit order.
"""
resx = np.round(resx).astype(np.uint8) # must contain only 0 and 1
N = len(resx) # length of res
L = int(np.sqrt(N)) # number of landmarks. CAST INTO INT but should not be a problem because N = L**2 by def.
nonzeroind = np.nonzero(resx)[0] # np.nonzero returns a tuple; keep the index array
nonzero_tup = np.unravel_index(nonzeroind, (L,L))
ind_a = nonzero_tup[0].tolist()
ind_b = nonzero_tup[1].tolist()
order = [0]
current = 0
used_indices = set() # Track visited index pairs
while True:
# Find index of the current node in ind_a
try:
i = ind_a.index(current)
except ValueError:
break # No more links, stop the search
if i in used_indices:
break # Prevent infinite loops
used_indices.add(i) # Mark this index as visited
next_node = ind_b[i] # Get the corresponding node in ind_b
order.append(next_node) # Add it to the path
# Continue the walk: now look for next_node in ind_a
current = next_node
return order
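The chain-following logic above can be exercised on a toy solution vector. Here a tour `0 -> 2 -> 1` for `L = 3` is encoded in the same `x[i*L + j]` layout and decoded back into a visit order:

```python
import numpy as np

# Toy solution for L = 3: tour 0 -> 2 -> 1 encoded as x[i*L + j] = 1.
L = 3
resx = np.zeros(L * L)
resx[0 * L + 2] = 1.0   # leave start (0) for landmark 2
resx[2 * L + 1] = 1.0   # then go from 2 to landmark 1

nonzero = np.nonzero(np.round(resx).astype(np.uint8))[0]
ind_a, ind_b = (arr.tolist() for arr in np.unravel_index(nonzero, (L, L)))

order, current, used = [0], 0, set()
while True:
    try:
        i = ind_a.index(current)
    except ValueError:
        break               # no outgoing edge: end of the path
    if i in used:
        break               # guard against infinite loops
    used.add(i)
    current = ind_b[i]
    order.append(current)
```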
def link_list(self, order: list[int], landmarks: list[Landmark])->list[Landmark] :
"""
Compute the time to reach from each landmark to the next and create a list of landmarks with updated travel times.
Args:
order (list[int]): List of indices representing the order of landmarks to visit.
landmarks (list[Landmark]): List of all landmarks.
Returns:
list[Landmark]]: The updated linked list of landmarks with travel times
"""
L = []
j = 0
while j < len(order)-1 :
# get landmarks involved
elem = landmarks[order[j]]
next = landmarks[order[j+1]]
# get attributes
elem.time_to_reach_next = get_time(elem.location, next.location)
elem.must_do = True
elem.location = (round(elem.location[0], 5), round(elem.location[1], 5))
elem.next_uuid = next.uuid
L.append(elem)
j += 1
next.location = (round(next.location[0], 5), round(next.location[1], 5))
next.must_do = True
L.append(next)
return L
def warm_start(self, x: list[pl.LpVariable], L: int) :
"""Seed the solver with a trivial guess: start -> first landmark -> finish."""
for i in range(L*L) :
x[i].setInitialValue(0)
x[1].setInitialValue(1)
x[2*L-1].setInitialValue(1)
return x
def pre_processing(self, L: int, landmarks: list[Landmark], max_time: int, max_landmarks: int | None) :
"""
Preprocesses the optimization problem by setting up constraints and variables for the tour optimization.
This method initializes and prepares the linear programming problem to optimize a tour that includes landmarks,
while respecting various constraints such as time limits, the number of landmarks to visit, and user preferences.
The pre-processing step sets up the problem before solving it using a linear programming solver.
Responsibilities:
- Defines the optimization problem using linear programming (LP) with the objective to maximize the tour value.
- Creates binary decision variables for each potential transition between landmarks.
- Sets up inequality constraints to respect the maximum time available for the tour and the maximum number of landmarks.
- Implements equality constraints to ensure the tour respects the start and finish positions, avoids staying in the same place,
and adheres to a visit order.
- Forces inclusion or exclusion of specific landmarks based on user preferences.
Attributes:
prob (pl.LpProblem): The linear programming problem to be solved.
x (list): A list of binary variables representing transitions between landmarks.
L (int): The total number of landmarks considered in the optimization.
landmarks (list[Landmark]): The list of landmarks to be visited in the tour.
max_time (int): The maximum allowable time for the entire tour.
max_landmarks (int | None): The maximum number of landmarks to visit in the tour, or None if no limit is set.
Returns:
prob (pl.LpProblem): The linear programming problem setup for optimization.
x (list): The list of binary variables for transitions between landmarks in the tour.
"""
if max_landmarks is None :
max_landmarks = self.max_landmarks
# Initialize the optimization problem
prob = pl.LpProblem("OptimizationProblem", pl.LpMaximize)
# Define the problem
x_bounds = [(0, 1)]*L*L
x = [pl.LpVariable(f"x_{i}", lowBound=x_bounds[i][0], upBound=x_bounds[i][1], cat='Binary') for i in range(L*L)]
# Setup the inequality constraints
self.init_ub_time(prob, x, L, landmarks, max_time) # Adds the distances from each landmark to the other.
self.respect_number(prob, x, L, max_landmarks) # Respects max number of visits (no more possible stops than landmarks).
self.break_sym(prob, x, L) # Breaks the 'zig-zag' symmetry. Avoids d12 and d21 but not larger circles.
# Setup the equality constraints
self.init_eq_not_stay(prob, x, L) # Force solution not to stay in same place
self.respect_start_finish(prob, x, L) # Force start and finish positions
self.respect_order(prob, x, L) # Respect order of visit (only works when max_time is limiting factor)
self.respect_user_must(prob, x, L, landmarks) # Force to do/avoid landmarks set by user.
# return prob, self.warm_start(x, L)
return prob, x
def solve_optimization(self, max_time: int, landmarks: list[Landmark], max_landmarks: int = None) -> list[Landmark]:
"""
Main optimization pipeline to solve the landmark visiting problem.
This method sets up and solves a linear programming problem with constraints to find an optimal tour of landmarks,
considering user-defined must-visit landmarks, start and finish points, and ensuring no cycles are present.
Args:
max_time (int): Maximum time allowed for the tour in minutes.
landmarks (list[Landmark]): List of landmarks to visit.
max_landmarks (int): Maximum number of landmarks visited
Returns:
list[Landmark]: The optimized tour of landmarks with updated travel times, or None if no valid solution is found.
"""
# Setup the optimization problem.
L = len(landmarks)
prob, x = self.pre_processing(L, landmarks, max_time, max_landmarks)
# Solve the problem and extract results.
prob.solve(pl.PULP_CBC_CMD(msg=False, gapRel=0.1, timeLimit=10, warmStart=False))
status = pl.LpStatus[prob.status]
solution = [pl.value(var) for var in x] # The values of the decision variables (will be 0 or 1)
self.logger.debug("First results are out. Looking out for circles and correcting.")
# Raise error if no solution is found. FIXME: for now this throws the internal server error
if status != 'Optimal' :
self.logger.error("The problem is overconstrained, no solution on first try.")
raise ArithmeticError("No solution could be found. Please try again with more time or different preferences.")
# If there is a solution, we're good to go, just check for connectiveness
circles = self.is_connected(solution)
i = 0
timeout = 40
while circles is not None :
i += 1
if i == timeout :
self.logger.error(f'Timeout: No solution found after {timeout} iterations.')
raise TimeoutError(f"Optimization took too long. No solution found after {timeout} iterations.")
for circle in circles :
self.prevent_circle(prob, x, circle, L)
# Solve the problem again
prob.solve(pl.PULP_CBC_CMD(msg=False))
solution = [pl.value(var) for var in x]
if pl.LpStatus[prob.status] != 'Optimal' :
self.logger.error(f"The problem is overconstrained, no solution after {i} cycles.")
raise ArithmeticError("No solution could be found. Please try again with more time or different preferences.")
circles = self.is_connected(solution)
if circles is None :
break
# Sort the landmarks in the order of the solution
order = self.get_order(solution)
tour = [landmarks[i] for i in order]
self.logger.debug(f"Re-optimized {i} times, objective value : {int(pl.value(prob.objective))}")
return tour

"""Allows to refine the tour by adding more landmarks and making the path easier to follow."""
import logging
from math import pi
import yaml
from shapely import buffer, LineString, Point, Polygon, MultiPoint, concave_hull
from ..structs.landmark import Landmark
from ..utils.get_time_distance import get_time
from ..utils.take_most_important import take_most_important
from .optimizer import Optimizer
from ..constants import OPTIMIZER_PARAMETERS_PATH
class Refiner :
"""
Refines a tour by incorporating smaller landmarks along the path to enhance the experience.
This class is designed to adjust an existing tour by considering additional,
smaller points of interest (landmarks) that may require minor detours but
improve the overall quality of the tour. It balances the efficiency of travel
with the added value of visiting these landmarks.
"""
logger = logging.getLogger(__name__)
detour_factor: float # detour factor of straight line vs real distance in cities
detour_corridor_width: float # width of the corridor around the path
average_walking_speed: float # average walking speed of adult
max_landmarks_refiner: int # max number of landmarks to visit
optimizer: Optimizer # optimizer object
def __init__(self, optimizer: Optimizer) :
self.optimizer = optimizer
# load parameters from file
with OPTIMIZER_PARAMETERS_PATH.open('r') as f:
parameters = yaml.safe_load(f)
self.detour_factor = parameters['detour_factor']
self.detour_corridor_width = parameters['detour_corridor_width']
self.average_walking_speed = parameters['average_walking_speed']
self.max_landmarks_refiner = parameters['max_landmarks_refiner']
def create_corridor(self, landmarks: list[Landmark], width: float) :
"""
Create a corridor around the path connecting the landmarks.
Args:
landmarks (list[Landmark]) : the landmark path around which to create the corridor
width (float) : width of the corridor in meters.
Returns:
Geometry: a buffered geometry object representing the corridor around the path.
"""
corrected_width = (180*width)/(6371000*pi)
path = self.create_linestring(landmarks)
obj = buffer(path, corrected_width, join_style="mitre", cap_style="square", mitre_limit=2)
return obj
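The width conversion above turns a buffer width in meters into degrees of latitude on a sphere of radius 6371 km, since the shapely geometries here live in degree coordinates: `degrees = 180 * meters / (R * pi)`. A quick numeric check:

```python
from math import pi

# Meters-to-degrees conversion used for the corridor buffer width.
width_m = 1000
corrected_width = (180 * width_m) / (6371000 * pi)
```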
def create_linestring(self, tour: list[Landmark]) -> LineString :
"""
Create a `LineString` object from a tour.
Args:
tour (list[Landmark]): An ordered sequence of landmarks that represents the visiting order.
Returns:
LineString: A `LineString` object representing the path through the landmarks.
"""
points = []
for landmark in tour :
points.append(Point(landmark.location))
return LineString(points)
# Check if some coordinates are in area. Used for the corridor
def is_in_area(self, area: Polygon, coordinates) -> bool :
"""
Check if a given point is within a specified area.
Args:
area (Polygon): The polygon defining the area.
coordinates (tuple[float, float]): The coordinates of the point to check.
Returns:
bool: True if the point is within the area, otherwise False.
"""
point = Point(coordinates)
return point.within(area)
# Function to determine if two landmarks are close to each other
def is_close_to(self, location1: tuple[float, float], location2: tuple[float, float]):
"""
Determine if two locations are close to each other, i.e. within 0.001 degrees on both axes.
Args:
location1 (tuple[float, float]): The coordinates of the first location.
location2 (tuple[float, float]): The coordinates of the second location.
Returns:
bool: True if the locations are within 0.001 degrees of each other, otherwise False.
"""
absx = abs(location1[0] - location2[0])
absy = abs(location1[1] - location2[1])
return absx < 0.001 and absy < 0.001
#return (round(location1[0], 3), round(location1[1], 3)) == (round(location2[0], 3), round(location2[1], 3))
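The absolute-difference test (unlike the commented-out rounding variant, which can separate points sitting on either side of a rounding boundary) treats two points as close when both coordinate deltas stay under 0.001 degrees, roughly 100 m of latitude. A standalone sketch:

```python
# Two points are "close" when both coordinate deltas are below 0.001 degrees.
def close(a: tuple[float, float], b: tuple[float, float]) -> bool:
    return abs(a[0] - b[0]) < 0.001 and abs(a[1] - b[1]) < 0.001
```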
def rearrange(self, tour: list[Landmark]) -> list[Landmark]:
"""
Rearrange landmarks to group nearby visits together.
This function reorders landmarks so that nearby landmarks are adjacent to each other in the list,
while keeping 'start' and 'finish' landmarks in their original positions.
Args:
tour (list[Landmark]): Ordered list of landmarks to be rearranged.
Returns:
list[Landmark]: The rearranged list of landmarks with grouped nearby visits.
"""
i = 1
while i < len(tour):
j = i+1
while j < len(tour):
if self.is_close_to(tour[i].location, tour[j].location) and tour[i].name not in ['start', 'finish'] and tour[j].name not in ['start', 'finish']:
# If they are not adjacent, move the j-th element to be adjacent to the i-th element
if j != i + 1:
tour.insert(i + 1, tour.pop(j))
break # Move to the next i-th element after rearrangement
j += 1
i += 1
return tour
def integrate_landmarks(self, sub_list: list[Landmark], main_list: list[Landmark]) :
"""
Insert a 'sub_list' of landmarks into 'main_list', leaving the start and finish endpoints untouched.
Args:
sub_list : the list of Landmarks to be inserted inside of the 'main_list'.
main_list : the original list with start and finish.
Returns:
the full list.
"""
sub_list.append(main_list[-1]) # add finish back
return main_list[:-1] + sub_list # create full set of possible landmarks
def find_shortest_path_through_all_landmarks(self, landmarks: list[Landmark]) -> tuple[list[Landmark], Polygon]:
"""
Find the shortest path through all landmarks using a nearest neighbor heuristic.
This function constructs a path that starts from the 'start' landmark, visits all other landmarks in the order
of their proximity, and ends at the 'finish' landmark. It returns both the ordered list of landmarks and a
polygon representing the path.
Args:
landmarks (list[Landmark]): list of all landmarks including 'start' and 'finish'.
Returns:
tuple[list[Landmark], Polygon]: A tuple where the first element is the list of landmarks in the order they
should be visited, and the second element is a `Polygon` representing
the path connecting all landmarks.
"""
# Step 1: Find 'start' and 'finish' landmarks
start_idx = next(i for i, lm in enumerate(landmarks) if lm.type == 'start')
finish_idx = next(i for i, lm in enumerate(landmarks) if lm.type == 'finish')
start_landmark = landmarks[start_idx]
finish_landmark = landmarks[finish_idx]
# Step 2: Create a list of unvisited landmarks excluding 'start' and 'finish'
unvisited_landmarks = [lm for i, lm in enumerate(landmarks) if i not in [start_idx, finish_idx]]
# Step 3: Initialize the path with the 'start' landmark
path = [start_landmark]
coordinates = [landmarks[start_idx].location]
current_landmark = start_landmark
# Step 4: Use nearest neighbor heuristic to visit all landmarks
while unvisited_landmarks:
nearest_landmark = min(unvisited_landmarks, key=lambda lm: get_time(current_landmark.location, lm.location))
path.append(nearest_landmark)
coordinates.append(nearest_landmark.location)
current_landmark = nearest_landmark
unvisited_landmarks.remove(nearest_landmark)
# Step 5: Finally add the 'finish' landmark to the path
path.append(finish_landmark)
coordinates.append(landmarks[finish_idx].location)
path_poly = Polygon(coordinates)
return path, path_poly
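The nearest-neighbor loop above, reduced to a standalone sketch on plain (x, y) tuples; squared Euclidean distance stands in for `get_time()`, which the real code uses.

```python
# Greedy nearest-neighbor ordering: start fixed, visit the closest
# unvisited point each step, finish fixed.
def nearest_neighbor_order(start, finish, others):
    path = [start]
    current = start
    unvisited = list(others)
    while unvisited:
        nearest = min(unvisited,
                      key=lambda p: (p[0] - current[0])**2 + (p[1] - current[1])**2)
        path.append(nearest)
        current = nearest
        unvisited.remove(nearest)
    path.append(finish)
    return path
```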
# Returns a list of minor landmarks around the planned path to enhance the experience
def get_minor_landmarks(self, all_landmarks: list[Landmark], visited_landmarks: list[Landmark], width: float) -> list[Landmark] :
"""
Identify landmarks within a specified corridor that have not been visited yet.
This function creates a corridor around the path defined by visited landmarks and then finds landmarks that fall
within this corridor. It returns a list of these landmarks, excluding those already visited, sorted by their importance.
Args:
all_landmarks (list[Landmark]): list of all available landmarks.
visited_landmarks (list[Landmark]): list of landmarks that have already been visited.
width (float): Width of the corridor around the visited landmarks.
Returns:
list[Landmark]: list of important landmarks within the corridor that have not been visited yet.
"""
second_order_landmarks = []
visited_names = []
area = self.create_corridor(visited_landmarks, width)
for visited in visited_landmarks :
visited_names.append(visited.name)
for landmark in all_landmarks :
if self.is_in_area(area, landmark.location) and landmark.name not in visited_names:
second_order_landmarks.append(landmark)
return take_most_important(second_order_landmarks, int(self.max_landmarks_refiner*0.75))
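The corridor test itself goes through shapely (`create_corridor` / `is_in_area`). As a rough, dependency-free illustration of the same idea: a point is "in the corridor" when it lies within half the corridor width of some leg of the visited path.

```python
import math

def point_segment_distance(p, a, b):
    """Distance from point p to the segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    dx, dy = bx - ax, by - ay
    if dx == 0 and dy == 0:
        return math.hypot(px - ax, py - ay)
    # Project p onto the segment, clamped to its endpoints
    t = max(0.0, min(1.0, ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)))
    return math.hypot(px - (ax + t * dx), py - (ay + t * dy))

def in_corridor(point, path, width):
    """True if `point` lies within width/2 of any leg of `path`."""
    return any(point_segment_distance(point, a, b) <= width / 2
               for a, b in zip(path, path[1:]))
```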
# Try to fix the shortest path using shapely
def fix_using_polygon(self, tour: list[Landmark]) -> list[Landmark] :
"""
Improve the tour path using geometric methods to ensure it follows a more optimal shape.
This function creates a polygon from the given tour and attempts to refine it using a concave hull. It reorders
the landmarks to fit within this refined polygon and adjusts the tour to ensure the 'start' landmark is at the
beginning. It also checks if the final polygon is simple and rearranges the tour if necessary.
Args:
tour (list[Landmark]): list of landmarks representing the current tour path.
Returns:
list[Landmark]: Refined list of landmarks in the order of visit to produce a better tour path.
"""
coords = []
coords_dict = {}
for landmark in tour :
coords.append(landmark.location)
if landmark.name != 'finish' :
coords_dict[landmark.location] = landmark
tour_poly = Polygon(coords)
better_tour_poly = tour_poly.buffer(0)
try :
xs, ys = better_tour_poly.exterior.xy
if len(xs) != len(tour) :
better_tour_poly = concave_hull(MultiPoint(coords)) # Create concave hull with "core" of tour leaving out start and finish
xs, ys = better_tour_poly.exterior.xy
except Exception:
# buffer(0) or concave_hull() may return a LineString or a MultiPolygon,
# neither of which has an .exterior attribute (this was the source of the
# AttributeError previously observed here). Fall back to the convex hull;
# if even that is degenerate (collinear points), keep the original order.
better_tour_poly = MultiPoint(coords).convex_hull
if better_tour_poly.geom_type != 'Polygon' :
return tour
xs, ys = better_tour_poly.exterior.xy
# reverse the xs and ys
xs.reverse()
ys.reverse()
better_tour = [] # list of ordered visit
name_index = {} # Maps the name of a landmark to its index in the concave polygon
# Loop through the polygon and generate the better (ordered) tour
for i,x in enumerate(xs[:-1]) :
y = ys[i]
better_tour.append(coords_dict[tuple((x,y))])
name_index[coords_dict[tuple((x,y))].name] = i
# Scroll the list to have start in front again
start_index = name_index['start']
better_tour = better_tour[start_index:] + better_tour[:start_index]
# Append the finish back and correct the time to reach
better_tour.append(tour[-1])
# Rearrange only if polygon still not simple
if not better_tour_poly.is_simple :
better_tour = self.rearrange(better_tour)
return better_tour
def refine_optimization(
self,
all_landmarks: list[Landmark],
base_tour: list[Landmark],
max_time: int,
detour: int
) -> list[Landmark]:
"""
This is the second stage of the optimization. It refines the initial tour by considering additional minor landmarks along the way.
This method evaluates the need for further optimization based on the initial tour. If a detour is allowed,
it adds minor landmarks around the initial predicted path and solves a new optimization problem to find a potentially better
tour. It then links the new tour and adjusts it using a nearest neighbor heuristic and polygon-based methods to
ensure a valid path. The final tour is chosen based on the shortest distance.
Args:
all_landmarks (list[Landmark]): The full list of landmarks available for the optimization.
base_tour (list[Landmark]): The initial tour path to be refined.
max_time (int): The maximum time available for the tour in minutes.
detour (int): The maximum detour time allowed for the tour in minutes.
Returns:
list[Landmark]: The refined list of landmarks representing the optimized tour path.
"""
# No need to refine if no detour is taken
# if detour == 0:
# return base_tour
minor_landmarks = self.get_minor_landmarks(all_landmarks, base_tour, self.detour_corridor_width)
self.logger.debug(f"Using {len(minor_landmarks)} minor landmarks around the predicted path")
# Full set of visitable landmarks.
full_set = self.integrate_landmarks(minor_landmarks, base_tour) # could probably be optimized with less overhead
# Generate a new tour with the optimizer.
new_tour = self.optimizer.solve_optimization(
max_time = max_time + detour,
landmarks = full_set,
max_landmarks = self.max_landmarks_refiner
)
# If unsuccessful optimization, use the base_tour.
if new_tour is None:
self.logger.warning("No solution found for the refined tour. Returning the initial tour.")
new_tour = base_tour
# If the tour is too short (start, finish and at most one landmark), return it as is.
if len(new_tour) < 4 :
return new_tour
# Find shortest path using the nearest neighbor heuristic.
better_tour, better_poly = self.find_shortest_path_through_all_landmarks(new_tour)
# Fix the tour using Polygons if the path looks weird.
# Conditions : circular trip and invalid polygon.
if base_tour[0].location == base_tour[-1].location and not better_poly.is_valid :
better_tour = self.fix_using_polygon(better_tour)
return better_tour


@@ -1,206 +0,0 @@
from scipy.optimize import linprog
import numpy as np
from scipy.linalg import block_diag
# Defines the landmark class (aka some place there is to visit)
class landmark :
def __init__(self, name: str, attractiveness: int, loc: tuple):
self.name = name
self.attractiveness = attractiveness
self.loc = loc
# Convert the result (edges from j to k like d_25 = edge between vertex 2 and vertex 5) into the list of indices corresponding to the landmarks
def untangle(res: list) :
N = len(res) # length of res
L = int(np.sqrt(N)) # number of landmarks. CAST INTO INT but should not be a problem because N = L**2 by def.
n_landmarks = res.sum() # number of visited landmarks
visit_order = []
cnt = 0
if n_landmarks % 2 == 1 : # if odd number of visited checkpoints
for i in range(L) :
for j in range(L) :
if res[i*L + j] == 1 : # if index is 1
cnt += 1 # increment counter
if cnt % 2 == 1 : # if counter odd
visit_order.append(i)
visit_order.append(j)
else : # if even number of ones
for i in range(L) :
for j in range(L) :
if res[i*L + j] == 1 : # if index is one
cnt += 1 # increment counter
if j % (L-1) == 0 : # if last node
visit_order.append(j) # append only the last index
return visit_order # return
if cnt % 2 == 1 :
visit_order.append(i)
visit_order.append(j)
return visit_order
# Just to print the result
def print_res(res: list, P) :
X = abs(res.x)
order = untangle(X)
# print("Optimal value:", -res.fun) # Minimization, so we negate to get the maximum
# print("Optimal point:", res.x)
# N = int(np.sqrt(len(X)))
# for i in range(N):
# print(X[i*N:i*N+N])
# print(order)
if (X.sum()+1)**2 == len(X) :
print('\nAll landmarks can be visited within max_steps, the following order is most likely not the fastest')
else :
print('Could not visit all the landmarks, the following order could be the fastest but not sure')
print("Order of visit :")
for i, elem in enumerate(landmarks) :
if i in order : print('- ' + elem.name)
steps = path_length(P, abs(res.x))
print("\nSteps walked : " + str(steps))
# Constraint to use only the upper triangular indices for travel
def break_sym(landmarks, A_eq, b_eq):
L = len(landmarks)
l = [0]*L*L
for i in range(L) :
for j in range(L) :
if i >= j :
l[j+i*L] = 1
A_eq = np.vstack((A_eq,l))
b_eq.append(0)
return A_eq, b_eq
# Constraint to respect max number of travels
def respect_number(landmarks, A_ub, b_ub):
h = []
for i in range(len(landmarks)) : h.append([1]*len(landmarks))
T = block_diag(*h)
return np.vstack((A_ub, T)), b_ub + [1]*len(landmarks)
# Constraint to tie the problem together and have a connected path
def respect_order(landmarks: list, A_eq, b_eq):
N = len(landmarks)
for i in range(N-1) : # Prevent stacked ones
if i == 0 :
continue
else :
l = [0]*N
l[i] = -1
l = l*N
for j in range(N) :
l[i*N + j] = 1
A_eq = np.vstack((A_eq,l))
b_eq.append(0)
return A_eq, b_eq
# Compute manhattan distance between 2 locations
def manhattan_distance(loc1: tuple, loc2: tuple):
x1, y1 = loc1
x2, y2 = loc2
return abs(x1 - x2) + abs(y1 - y2)
# Constraint to not stay in position
def init_eq_not_stay(landmarks):
L = len(landmarks)
l = [0]*L*L
for i in range(L) :
for j in range(L) :
if j == i :
l[j + i*L] = 1
#A_eq = np.array([np.array(xi) for xi in A_eq]) # Must convert A_eq into an np array
l = np.array(np.array(l))
return [l], [0]
# Initialize A and c. Compute the distances from all landmarks to each other and store attractiveness
# We want to maximize the sightseeing : max(c) st. A*x < b and A_eq*x = b_eq
def init_ub_dist(landmarks: list, max_steps: int):
# Objective function coefficients. a*x1 + b*x2 + c*x3 + ...
c = []
# Coefficients of inequality constraints (left-hand side)
A = []
for i, spot1 in enumerate(landmarks) :
dist_table = [0]*len(landmarks)
c.append(-spot1.attractiveness)
for j, spot2 in enumerate(landmarks) :
dist_table[j] = manhattan_distance(spot1.loc, spot2.loc)
A.append(dist_table)
c = c*len(landmarks)
A_ub = []
for line in A :
A_ub += line
return c, A_ub, [max_steps]
# Go through the landmarks and force the optimizer to use landmarks where attractiveness is set to -1
def respect_user_mustsee(landmarks: list, A_eq: list, b_eq: list) :
L = len(landmarks)
for i, elem in enumerate(landmarks) :
if elem.attractiveness == -1 :
l = [0]*L*L
if elem.name != "arrivée" :
for j in range(L) :
l[j +i*L] = 1
else : # This ensures we go to goal
for k in range(L-1) :
l[k*L+L-1] = 1
A_eq = np.vstack((A_eq,l))
b_eq.append(1)
return A_eq, b_eq
# Computes the path length given path matrix (dist_table) and a result
def path_length(P: list, resx: list) :
return np.dot(P, resx)
# Initialize all landmarks (+ start and goal). Order matters here
landmarks = []
landmarks.append(landmark("départ", -1, (0, 0)))
landmarks.append(landmark("concorde", -1, (5,5)))
landmarks.append(landmark("tour eiffel", 99, (1,1))) # PUT IN JSON
landmarks.append(landmark("arc de triomphe", 99, (2,3)))
landmarks.append(landmark("louvre", 70, (4,2)))
landmarks.append(landmark("montmartre", 20, (0,2)))
landmarks.append(landmark("arrivée", -1, (0, 0)))
# CONSTRAINT TO RESPECT MAX NUMBER OF STEPS
max_steps = 25
# SET CONSTRAINTS FOR INEQUALITY
c, A_ub, b_ub = init_ub_dist(landmarks, max_steps) # Add the distances from each landmark to the other
P = A_ub # store the paths for later. Needed to compute path length
A_ub, b_ub = respect_number(landmarks, A_ub, b_ub) # Respect max number of visits.
# SET CONSTRAINTS FOR EQUALITY
A_eq, b_eq = init_eq_not_stay(landmarks) # Force solution not to stay in same place
A_eq, b_eq = respect_user_mustsee(landmarks, A_eq, b_eq) # Check if there are user_defined must_see. Also takes care of start/goal
A_eq, b_eq = break_sym(landmarks, A_eq, b_eq) # break the symmetry. Only use the upper diagonal values
A_eq, b_eq = respect_order(landmarks, A_eq, b_eq) # Respect order of visit (only works when max_steps is limiting factor)
# Bounds for variables (x can only be 0 or 1)
x_bounds = [(0, 1)] * len(c)
# Solve linear programming problem
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq = b_eq, bounds=x_bounds, method='highs', integrality=3)
# Raise error if no solution is found
if not res.success :
raise ValueError("No solution has been found, please adapt your max steps")
# Print result
print_res(res, P)


@@ -0,0 +1,140 @@
"""Module defining the caching strategy for overpass requests."""
import os
import xml.etree.ElementTree as ET
import hashlib
from ..constants import OSM_CACHE_DIR
def get_cache_key(query: str) -> str:
"""
Generate a unique cache key for the query using a hash function.
This ensures that queries with different parameters are cached separately.
"""
return hashlib.md5(query.encode('utf-8')).hexdigest()
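A small demonstration of the key property relied on here: the key is an md5 of the raw query string, so identical queries share one cache entry while any textual change (even the last decimal of a coordinate) produces a separate one — which is why the Overpass query builder rounds coordinates before building the query string. The query strings below are illustrative only.

```python
import hashlib

def get_cache_key(query: str) -> str:
    # Same hashing as above: deterministic, 32-hex-char key per query string
    return hashlib.md5(query.encode('utf-8')).hexdigest()

same_a = get_cache_key('node[amenity](around:1000, 48.86, 2.35);out center;')
same_b = get_cache_key('node[amenity](around:1000, 48.86, 2.35);out center;')
other  = get_cache_key('node[amenity](around:1000, 48.85999, 2.35);out center;')
```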
class CachingStrategyBase:
"""
Base class for implementing caching strategies.
This class defines the structure for a caching strategy with basic methods
that must be implemented by subclasses. Subclasses should define how to
retrieve, store, and close the cache.
"""
def get(self, key):
"""Retrieve the cached data associated with the provided key."""
raise NotImplementedError('Subclass should implement get')
def set(self, key, value):
"""Store data in the cache with the specified key."""
raise NotImplementedError('Subclass should implement set')
def close(self):
"""Clean up or close any resources used by the caching strategy."""
class XMLCache(CachingStrategyBase):
"""
A caching strategy that stores and retrieves data in XML format.
This class provides methods to cache data as XML files in a specified directory.
The directory is automatically suffixed with '_XML' to distinguish it from other
caching strategies. The data is stored and retrieved using XML serialization.
Args:
cache_dir (str): The base directory where XML cache files will be stored.
Defaults to 'OSM_CACHE_DIR' with a '_XML' suffix.
Methods:
get(key): Retrieve cached data from an XML file associated with the given key.
set(key, value): Store data in an XML file with the specified key.
"""
def __init__(self, cache_dir=OSM_CACHE_DIR):
# Add the class name as a suffix to the directory
self._cache_dir = f'{cache_dir}_XML'
if not os.path.exists(self._cache_dir):
os.makedirs(self._cache_dir)
def _filename(self, key):
return os.path.join(self._cache_dir, f'{key}.xml')
def get(self, key):
"""Retrieve XML data from the cache and parse it as an ElementTree."""
filename = self._filename(key)
if os.path.exists(filename):
try:
# Parse and return the cached XML data
tree = ET.parse(filename)
return tree.getroot() # Return the root element of the parsed XML
except ET.ParseError:
# Corrupt or unparsable cache entry: treat it as a cache miss
return None
return None
def set(self, key, value):
"""Save the XML data as an ElementTree to the cache."""
filename = self._filename(key)
tree = ET.ElementTree(value) # value is expected to be an ElementTree root element
try:
# Write the XML data to a file
with open(filename, 'wb') as file:
tree.write(file, encoding='utf-8', xml_declaration=True)
except IOError as e:
raise IOError(f"Error writing to cache file: {filename} - {e}") from e
class CachingStrategy:
"""
A class to manage different caching strategies.
This class provides an interface to switch between different caching strategies
(e.g., XMLCache, JSONCache) dynamically. It allows caching data in different formats,
depending on the strategy being used. By default, it uses the XMLCache strategy.
Attributes:
__strategy (CachingStrategyBase): The currently active caching strategy.
__strategies (dict): A mapping between strategy names (as strings) and their corresponding
classes, allowing dynamic selection of caching strategies.
"""
__strategy = XMLCache() # Default caching strategy
__strategies = {
'XML': XMLCache,
}
@classmethod
def use(cls, strategy_name='XML', **kwargs):
"""
Set the caching strategy based on the strategy_name provided.
Args:
strategy_name (str): The name of the caching strategy (e.g., 'XML').
**kwargs: Additional keyword arguments to pass when initializing the strategy.
"""
# If a previous strategy exists, close it
if cls.__strategy:
cls.__strategy.close()
# Retrieve the strategy class based on the strategy name
strategy_class = cls.__strategies.get(strategy_name)
if not strategy_class:
raise ValueError(f"Unknown caching strategy: {strategy_name}")
# Instantiate the new strategy with the provided arguments
cls.__strategy = strategy_class(**kwargs)
return cls.__strategy
@classmethod
def get(cls, key):
"""Get data from the current strategy's cache."""
if not cls.__strategy:
raise RuntimeError("Caching strategy has not been set.")
return cls.__strategy.get(key)
@classmethod
def set(cls, key, value):
"""Set data in the current strategy's cache."""
if not cls.__strategy:
raise RuntimeError("Caching strategy has not been set.")
cls.__strategy.set(key, value)


@@ -0,0 +1,171 @@
"""Module allowing connection to the Overpass API and fetching data from OSM."""
from typing import Literal, List
import urllib
import logging
import xml.etree.ElementTree as ET
from .caching_strategy import get_cache_key, CachingStrategy
from ..constants import OSM_CACHE_DIR
logger = logging.getLogger('Overpass')
osm_types = List[Literal['way', 'node', 'relation']]
class Overpass :
"""
Overpass class to manage the query building and sending to overpass api.
The caching strategy is a part of this class and initialized upon creation of the Overpass object.
"""
def __init__(self, caching_strategy: str = 'XML', cache_dir: str = OSM_CACHE_DIR) :
"""
Initialize the Overpass instance with the url, headers and caching strategy.
"""
self.overpass_url = "https://overpass-api.de/api/interpreter"
self.headers = {'User-Agent': 'Mozilla/5.0 (compatible; OverpassQuery/1.0; +http://example.com)',}
self.caching_strategy = CachingStrategy.use(caching_strategy, cache_dir=cache_dir)
@classmethod
def build_query(cls, area: tuple, osm_types: osm_types,
selector: str, conditions=[], out='center') -> str:
"""
Constructs a query string for the Overpass API to retrieve OpenStreetMap (OSM) data.
Args:
area (tuple): A tuple representing the geographical search area, typically in the format
(radius, latitude, longitude). The first element is a string like "around:2000"
specifying the search radius, and the second and third elements represent
the latitude and longitude as floats or strings.
osm_types (list[str]): A list of OSM element types to search for. Must be one or more of
'Way', 'Node', or 'Relation'.
selector (str): The key or tag to filter the OSM elements (e.g., 'amenity', 'highway', etc.).
conditions (list, optional): A list of conditions to apply as additional filters for the
selected OSM elements. The conditions should be written in
the Overpass QL format, and they are combined with '&&' if
multiple are provided. Defaults to an empty list.
out (str, optional): Specifies the output type, such as 'center', 'body', or 'tags'.
Defaults to 'center'.
Returns:
str: The constructed Overpass QL query string.
Notes:
- If no conditions are provided, the query will just use the `selector` to filter the OSM
elements without additional constraints.
- The search area must always be formatted as "(radius, lat, lon)".
"""
if not isinstance(conditions, list) :
conditions = [conditions]
if not isinstance(osm_types, list) :
osm_types = [osm_types]
query = '('
# Round the radius to nearest 50 and coordinates to generate less queries
if area[0] > 500 :
search_radius = round(area[0] / 50) * 50
loc = tuple((round(area[1], 2), round(area[2], 2)))
else :
search_radius = round(area[0] / 25) * 25
loc = tuple((round(area[1], 3), round(area[2], 3)))
search_area = f"(around:{search_radius}, {str(loc[0])}, {str(loc[1])})"
if conditions :
conditions = '(if: ' + ' && '.join(conditions) + ')'
else :
conditions = ''
for elem in osm_types :
query += elem + '[' + selector + ']' + conditions + search_area + ';'
query += ');' + f'out {out};'
return query
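A standalone sketch of the assembly steps above (illustrative only; the function name is made up). It mirrors the rounding and concatenation, so equivalent nearby queries produce byte-identical strings and hence identical cache keys.

```python
def sketch_build_query(area, osm_types, selector, conditions=(), out='center'):
    radius, lat, lon = area
    # Round radius and coordinates to collapse near-identical queries
    if radius > 500:
        radius = round(radius / 50) * 50
        lat, lon = round(lat, 2), round(lon, 2)
    else:
        radius = round(radius / 25) * 25
        lat, lon = round(lat, 3), round(lon, 3)
    search_area = f"(around:{radius}, {lat}, {lon})"
    cond = '(if: ' + ' && '.join(conditions) + ')' if conditions else ''
    # One clause per OSM element type, wrapped in a union
    body = ''.join(f'{t}[{selector}]{cond}{search_area};' for t in osm_types)
    return f'({body});out {out};'
```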
def send_query(self, query: str) -> ET.Element:
"""
Sends the Overpass QL query to the Overpass API and returns the parsed XML response.
Args:
query (str): The Overpass QL query to be sent to the Overpass API.
Returns:
ET.Element: The root element of the parsed XML response (served from the cache when available).
"""
# Generate a cache key for the current query
cache_key = get_cache_key(query)
# Try to fetch the result from the cache
cached_response = self.caching_strategy.get(cache_key)
if cached_response is not None :
logger.debug("Cache hit.")
return cached_response
# Prepare the data to be sent as POST request, encoded as bytes
data = urllib.parse.urlencode({'data': query}).encode('utf-8')
try:
# Create a Request object with the specified URL, data, and headers
request = urllib.request.Request(self.overpass_url, data=data, headers=self.headers)
# Send the request and read the response
with urllib.request.urlopen(request) as response:
# Read and decode the response
response_data = response.read().decode('utf-8')
root = ET.fromstring(response_data)
# Cache the response data as an ElementTree root
self.caching_strategy.set(cache_key, root)
logger.debug("Response data added to cache.")
return root
except urllib.error.URLError as e:
raise ConnectionError(f"Error connecting to Overpass API: {e}") from e
def get_base_info(elem: ET.Element, osm_type: osm_types, with_name=False) :
"""
Extracts base information (coordinates, OSM ID, and optionally a name) from an OSM element.
This function retrieves the latitude and longitude coordinates, OSM ID, and optionally the name
of a given OpenStreetMap (OSM) element. It handles different OSM types (e.g., 'node', 'way') by
extracting coordinates either directly or from a center tag, depending on the element type.
Args:
elem (ET.Element): The XML element representing the OSM entity.
osm_type (str): The type of the OSM entity (e.g., 'node', 'way'). If 'node', the coordinates
are extracted directly from the element; otherwise, from the 'center' tag.
with_name (bool): Whether to extract and return the name of the element. If True, it attempts
to find the 'name' tag within the element and return its value. Defaults to False.
Returns:
tuple: A tuple containing:
- osm_id (str): The OSM ID of the element.
- coords (tuple): A tuple of (latitude, longitude) coordinates.
- name (str, optional): The name of the element if `with_name` is True; otherwise, not included.
"""
# 1. extract coordinates
if osm_type != 'node' :
center = elem.find('center')
lat = float(center.get('lat'))
lon = float(center.get('lon'))
else :
lat = float(elem.get('lat'))
lon = float(elem.get('lon'))
coords = tuple((lat, lon))
# 2. Extract OSM id
osm_id = elem.get('id')
# 3. Extract name if specified and return
if with_name :
name = elem.find("tag[@k='name']").get('v') if elem.find("tag[@k='name']") is not None else None
return osm_id, coords, name
else :
return osm_id, coords
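A small self-contained illustration of what `get_base_info()` reads, using a hand-written Overpass-style `<node>` fragment (the id, coordinates and name here are illustrative):

```python
import xml.etree.ElementTree as ET

node = ET.fromstring('<node id="42" lat="48.8584" lon="2.2945">'
                     '<tag k="name" v="Tour Eiffel"/></node>')
# 1. coordinates come straight from the element for a 'node'
lat, lon = float(node.get('lat')), float(node.get('lon'))
# 2. the OSM id is an attribute
osm_id = node.get('id')
# 3. the name lives in a nested <tag k='name' v='...'/> element
name_tag = node.find("tag[@k='name']")
name = name_tag.get('v') if name_tag is not None else None
```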


@@ -0,0 +1,92 @@
# Tags were picked mostly arbitrarily, based on the OSM wiki and the OSM tags page.
# See https://taginfo.openstreetmap.org for more inspiration.
nature:
leisure: park
geological: ''
natural:
- geyser
- hot_spring
- arch
- volcano
- stone
tourism:
- alpine_hut
- viewpoint
- zoo
- resort
- picnic_site
water:
- pond
- lake
- river
- basin
- stream
- lagoon
- rapids
waterway:
- waterfall
- river
- canal
- dam
- dock
- boatyard
shopping:
shop:
- department_store
- mall
sightseeing:
tourism:
- museum
- attraction
- gallery
- artwork
- aquarium
historic: ''
amenity:
- planetarium
- place_of_worship
- fountain
- townhall
water: reflecting_pool
bridge:
- aqueduct
- viaduct
- boardwalk
- cantilever
- abandoned
building: cathedral
# unused sightseeing/buildings:
# - church
# - chapel
# - mosque
# - synagogue
# - ruins
# - temple
# - government
# - cathedral
# - castle
# - museum
museums:
tourism:
- museum
- aquarium
# to be used later on
restauration:
shop:
- coffee
- bakery
- restaurant
- pastry
amenity:
- restaurant
- cafe
- ice_cream
- food_court
- biergarten


@@ -0,0 +1,12 @@
city_bbox_side: 7500 #m
radius_close_to: 50
church_coeff: 0.55
nature_coeff: 1.4
overall_coeff: 10
tag_exponent: 1.15
image_bonus: 1.1
viewpoint_bonus: 5
wikipedia_bonus: 1.25
name_bonus: 3
N_important: 40
pay_bonus: -1


@@ -0,0 +1,6 @@
detour_factor: 1.4
detour_corridor_width: 300
average_walking_speed: 4.8
max_landmarks: 10
max_landmarks_refiner: 20
overshoot: 0.0016


@@ -0,0 +1,151 @@
"""Definition of the Landmark class to handle visitable objects across the world."""
from typing import Optional, Literal
from uuid import uuid4, UUID
from pydantic import BaseModel, Field
# Output to frontend
class Landmark(BaseModel) :
"""
A class representing a landmark or point of interest (POI) in the context of a trip.
The Landmark class is used to model visitable locations, such as tourist attractions,
natural sites, shopping locations, and start/end points in travel itineraries. It
holds information about the landmark's attributes and supports comparisons and
calculations, such as distance between landmarks.
Attributes:
name (str): The name of the landmark.
type (Literal): The type of the landmark, which can be one of ['sightseeing', 'nature',
'shopping', 'start', 'finish'].
location (tuple): A tuple representing the (latitude, longitude) of the landmark.
osm_type (str): The OpenStreetMap (OSM) type of the landmark.
osm_id (int): The OpenStreetMap (OSM) ID of the landmark.
attractiveness (int): A score representing the attractiveness of the landmark.
n_tags (int): The number of tags associated with the landmark.
image_url (Optional[str]): A URL to an image of the landmark.
website_url (Optional[str]): A URL to the landmark's official website.
description (Optional[str]): A text description of the landmark.
duration (Optional[int]): The estimated time to visit the landmark (in minutes).
name_en (Optional[str]): The English name of the landmark.
uuid (UUID): A unique identifier for the landmark, generated by default using uuid4.
must_do (Optional[bool]): Whether the landmark is a "must-do" attraction.
must_avoid (Optional[bool]): Whether the landmark should be avoided.
is_secondary (Optional[bool]): Whether the landmark is secondary or less important.
time_to_reach_next (Optional[int]): Estimated time (in minutes) to reach the next landmark.
next_uuid (Optional[UUID]): UUID of the next landmark in sequence (if applicable).
"""
# Properties of the landmark
name : str
type: Literal['sightseeing', 'nature', 'shopping', 'start', 'finish']
location : tuple
osm_type : str
osm_id : int
attractiveness : int
n_tags : int
# Optional properties to gather more information.
image_url : Optional[str] = None
website_url : Optional[str] = None
wiki_url : Optional[str] = None
description : Optional[str] = None # TODO future
duration : Optional[int] = 5
name_en : Optional[str] = None
# Unique ID of a given landmark
uuid: UUID = Field(default_factory=uuid4)
# Additional properties depending on specific tour
must_do : Optional[bool] = False
must_avoid : Optional[bool] = False
is_secondary : Optional[bool] = False
time_to_reach_next : Optional[int] = 0
next_uuid : Optional[UUID] = None
# More properties to define the score
is_viewpoint : Optional[bool] = False
is_place_of_worship : Optional[bool] = False
def __str__(self) -> str:
"""
String representation of the Landmark object.
Returns:
str: A formatted string with the landmark's type, name, location, attractiveness score,
time to the next landmark (if available), and whether the landmark is secondary.
"""
t_to_next_str = f", time_to_next={self.time_to_reach_next}" if self.time_to_reach_next else ""
is_secondary_str = ", secondary" if self.is_secondary else ""
type_str = '(' + self.type + ')'
return (f'Landmark{type_str}: [{self.name} @{self.location}, '
f'score={self.attractiveness}{t_to_next_str}{is_secondary_str}]')
def distance(self, value: 'Landmark') -> float:
"""
Calculates the squared distance between this landmark and another.
Args:
value (Landmark): Another Landmark object to calculate the distance to.
Returns:
float: The squared Euclidean distance between the two landmarks.
"""
return (self.location[0] - value.location[0])**2 + (self.location[1] - value.location[1])**2
def __hash__(self) -> int:
"""
Generates a hash for the Landmark based on its name.
Returns:
int: The hash of the landmark.
"""
return hash(self.name)
def __eq__(self, value: 'Landmark') -> bool:
"""
Checks equality between two Landmark objects based on UUID, OSM ID, and name.
Args:
value (Landmark): Another Landmark object to compare.
Returns:
bool: True if the landmarks are equal, False otherwise.
"""
# eq and hash must be consistent
# in particular, if two objects are equal, their hash must be equal
# uuid and osm_id are just shortcuts to avoid comparing all the properties
# if they are equal, we know that the name is also equal and in turn the hash is equal
return (self.uuid == value.uuid or
self.osm_id == value.osm_id or
(self.name == value.name and self.distance(value) < 0.001))
class Toilets(BaseModel) :
"""
Model for toilets. When a field is false or empty, the information is either false or unknown.
"""
location : tuple
wheelchair : Optional[bool] = False
changing_table : Optional[bool] = False
fee : Optional[bool] = False
opening_hours : Optional[str] = ""
def __str__(self) -> str:
"""
String representation of the Toilets object.
Returns:
str: A formatted string with the toilets location.
"""
return f'Toilets @{self.location}'
class Config:
"""
This allows us to easily convert the model to and from dictionaries
"""
from_attributes = True


@@ -0,0 +1,78 @@
"""Linked and ordered list of Landmarks that represents the visiting order."""
from .landmark import Landmark
from ..utils.get_time_distance import get_time
class LinkedLandmarks:
"""
A list of landmarks that are linked together, e.g. in a route.
Each landmark serves as a node in the linked list, but since we expect
these to be consumed through the REST API, a pythonic reference to the next
landmark is not well suited. Instead we use the uuid of the next landmark
to reference the next landmark in the list. This is not very efficient,
but appropriate for the expected use case
("short" trips with only a few landmarks).
"""
_landmarks: list[Landmark]
total_time: int = 0
def __init__(self, data: list[Landmark] = None) -> None:
"""
Initialize a new LinkedLandmarks object. This expects an ORDERED list of landmarks,
where the first landmark is the starting point and the last landmark is the end point.
Args:
data (list[Landmark], optional): The list of landmarks that are linked together.
Defaults to None.
"""
self._landmarks = data if data else []
self._link_landmarks()
def _link_landmarks(self) -> None:
"""
Create the links between the landmarks in the list by setting their
.next_uuid and the .time_to_next attributes.
"""
# Mark secondary landmarks as such
self.update_secondary_landmarks()
for i, landmark in enumerate(self._landmarks[:-1]):
landmark.next_uuid = self._landmarks[i + 1].uuid
time_to_next = get_time(landmark.location, self._landmarks[i + 1].location)
landmark.time_to_reach_next = time_to_next
self.total_time += time_to_next
self.total_time += landmark.duration
self._landmarks[-1].next_uuid = None
self._landmarks[-1].time_to_reach_next = 0
def update_secondary_landmarks(self) -> None:
"""
Mark landmarks with lower importance as secondary.
"""
# Extract the attractiveness scores and sort them in descending order
scores = sorted([landmark.attractiveness for landmark in self._landmarks], reverse=True)
# Determine the 10th highest score
if len(scores) >= 10:
threshold_score = scores[9]
else:
# If there are fewer than 10 landmarks, use the lowest score as the threshold
threshold_score = min(scores) if scores else 0
# Update 'is_secondary' for landmarks with attractiveness below the threshold score
for landmark in self._landmarks:
if (landmark.attractiveness < threshold_score and landmark.type not in ["start", "finish"]):
landmark.is_secondary = True
def __getitem__(self, index: int) -> Landmark:
return self._landmarks[index]
def __str__(self) -> str:
return f"LinkedLandmarks [{' -> '.join([str(landmark) for landmark in self._landmarks])}]"
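The uuid-based linking described in the class docstring can be sketched as follows. `SimpleLandmark` and the `by_uuid` lookup are illustrative stand-ins for the real `Landmark` model and the cache, not part of the actual API:

```python
import uuid

class SimpleLandmark:
    """Illustrative stand-in for the real Landmark model."""
    def __init__(self, name):
        self.uuid = uuid.uuid4()
        self.name = name
        self.next_uuid = None  # filled in when the list is linked

# Build a linked chain of three landmarks
stops = [SimpleLandmark(n) for n in ("Cathedral", "Park", "Museum")]
for current, nxt in zip(stops, stops[1:]):
    current.next_uuid = nxt.uuid

# A client that only has uuids (e.g. via the REST API) can rebuild the order
by_uuid = {lm.uuid: lm for lm in stops}
order = []
cursor = stops[0].uuid
while cursor is not None:
    lm = by_uuid[cursor]
    order.append(lm.name)
    cursor = lm.next_uuid

print(order)  # ['Cathedral', 'Park', 'Museum']
```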

View File

@@ -0,0 +1,34 @@
"""Defines the Preferences used as input for trip generation."""
from typing import Optional, Literal
from pydantic import BaseModel
class Preference(BaseModel) :
"""
Type of preference.
Attributes:
type: what kind of landmark type.
score: how important that type is.
"""
type: Literal['sightseeing', 'nature', 'shopping', 'start', 'finish']
score: int # score could be from 1 to 5
# Input for optimization
class Preferences(BaseModel) :
"""
Full collection of preferences needed to generate a personalized trip.
"""
# Sightseeing / History & Culture (museums, historic buildings, opera houses, churches)
sightseeing : Preference
# Nature (parks, gardens, rivers, beaches)
nature: Preference
# Shopping (steer towards shopping areas / streets)
shopping : Preference
max_time_minute: Optional[int] = 3*60
detour_tolerance_minute: Optional[int] = 0
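A request body matching `Preferences` might look like the following sketch; the values are arbitrary, and `max_time_minute` / `detour_tolerance_minute` fall back to the defaults above when omitted:

```python
import json

# Illustrative payload for trip generation (values are arbitrary)
preferences = {
    "sightseeing": {"type": "sightseeing", "score": 5},
    "nature": {"type": "nature", "score": 2},
    "shopping": {"type": "shopping", "score": 3},
    "max_time_minute": 180,        # optional, defaults to 3*60
    "detour_tolerance_minute": 0,  # optional, defaults to 0
}

# Scores are expected in the 1 to 5 range noted on the Preference model
assert all(1 <= preferences[k]["score"] <= 5 for k in ("sightseeing", "nature", "shopping"))
print(json.dumps(preferences, indent=2))
```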

View File

@@ -0,0 +1,48 @@
"""Definition of the Trip class."""
from uuid import uuid4, UUID
from pydantic import BaseModel, Field
from pymemcache.client.base import Client
from .linked_landmarks import LinkedLandmarks
class Trip(BaseModel):
"""
A Trip represents the final guided tour that can be passed to frontend.
Attributes:
uuid: unique identifier for this particular trip.
total_time: duration of the trip (in minutes).
first_landmark_uuid: unique identifier of the first Landmark to visit.
Methods:
from_linked_landmarks: create a Trip from LinkedLandmarks object.
"""
uuid: UUID = Field(default_factory=uuid4)
total_time: int
first_landmark_uuid: UUID
@classmethod
def from_linked_landmarks(cls, landmarks: LinkedLandmarks, cache_client: Client) -> "Trip":
"""
Initialize a new Trip object and ensure it is stored in the cache.
"""
trip = Trip(
total_time = landmarks.total_time,
first_landmark_uuid = landmarks[0].uuid
)
# Store the trip in the cache
cache_client.set(f"trip_{trip.uuid}", trip)
# Make sure to await the result (noreply=False).
# Otherwise the cache might not be in place when the trip is actually requested.
cache_client.set_many({f"landmark_{landmark.uuid}": landmark for landmark in landmarks},
expire=3600, noreply=False)
# is equivalent to:
# for landmark in landmarks:
# cache_client.set(f"landmark_{landmark.uuid}", landmark, expire=3600)
return trip
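The `set_many` equivalence noted in the comment can be illustrated with a minimal in-memory cache. `FakeCache` is a hypothetical stand-in for `pymemcache.client.base.Client`, reduced to the two methods used here:

```python
class FakeCache:
    """Hypothetical in-memory stand-in for pymemcache's Client."""
    def __init__(self):
        self.store = {}

    def set(self, key, value, expire=0, noreply=True):
        # A real client would send this to memcached; here we just store it.
        self.store[key] = value

    def set_many(self, mapping, expire=0, noreply=True):
        # Equivalent to calling set() once per key, as the comment above notes.
        for key, value in mapping.items():
            self.set(key, value, expire=expire, noreply=noreply)

cache = FakeCache()
landmarks = {"landmark_a1": "Cathedral", "landmark_b2": "Park"}
cache.set_many(landmarks, expire=3600, noreply=False)

assert cache.store["landmark_a1"] == "Cathedral"
```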

View File

View File

@@ -0,0 +1,62 @@
"""Collection of tests to ensure correct handling of invalid input."""
from fastapi.testclient import TestClient
import pytest
from ..main import app
@pytest.fixture(scope="module")
def invalid_client():
"""Client used to call the app."""
return TestClient(app)
@pytest.mark.parametrize(
"start,preferences,status_code",
[
# Invalid case: no preferences at all.
([48.8566, 2.3522], {}, 422),
# Invalid cases: incomplete preferences.
([48.084588, 7.280405], {"sightseeing": {"type": "nature", "score": 5}, # no shopping
"nature": {"type": "nature", "score": 5},
}, 422),
([48.084588, 7.280405], {"sightseeing": {"type": "nature", "score": 5}, # no nature
"shopping": {"type": "shopping", "score": 5},
}, 422),
([48.084588, 7.280405], {"nature": {"type": "nature", "score": 5}, # no sightseeing
"shopping": {"type": "shopping", "score": 5},
}, 422),
# Invalid cases: nonexistent coords
([91, 181], {"sightseeing": {"type": "nature", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
}, 422),
([-91, 181], {"sightseeing": {"type": "nature", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
}, 422),
([91, -181], {"sightseeing": {"type": "nature", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
}, 422),
([-91, -181], {"sightseeing": {"type": "nature", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
}, 422),
]
)
def test_input(invalid_client, start, preferences, status_code): # pylint: disable=redefined-outer-name
"""
Test new trip creation with different sets of preferences and locations.
"""
response = invalid_client.post(
"/trip/new",
json={
"preferences": preferences,
"start": start
}
)
assert response.status_code == status_code

View File

@@ -0,0 +1,344 @@
"""Collection of tests to ensure correct implementation and track progress. """
import time
from fastapi.testclient import TestClient
import pytest
from .test_utils import load_trip_landmarks, log_trip_details
from ..main import app
@pytest.fixture(scope="module")
def client():
"""Client used to call the app."""
return TestClient(app)
def test_turckheim(client, request): # pylint: disable=redefined-outer-name
"""
Test n°1 : Custom test in Turckheim to ensure small villages are also supported.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 20
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [48.084588, 7.280405]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
# for elem in landmarks :
# print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert isinstance(landmarks, list) # check that the return type is a list
assert len(landmarks) > 2 # check that there is something to visit
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
def test_bellecour(client, request) : # pylint: disable=redefined-outer-name
"""
Test n°2 : Custom test in Lyon centre to ensure proper decision making in crowded area.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 120
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [45.7576485, 4.8330241]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
# for elem in landmarks :
# print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
def test_cologne(client, request) : # pylint: disable=redefined-outer-name
"""
Custom test in Cologne to ensure proper decision making in a crowded area.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 240
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [50.942352665, 6.957777972392]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
# for elem in landmarks :
# print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
def test_strasbourg(client, request) : # pylint: disable=redefined-outer-name
"""
Custom test in Strasbourg to ensure proper decision making in a crowded area.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 180
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [48.5846589226, 7.74078715721]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
# for elem in landmarks :
# print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
def test_zurich(client, request) : # pylint: disable=redefined-outer-name
"""
Custom test in Zurich to ensure proper decision making in a crowded area.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 180
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [47.377884227, 8.5395114066]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
# for elem in landmarks :
# print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
def test_paris(client, request) : # pylint: disable=redefined-outer-name
"""
Test n°2 : Custom test in Paris (les Halles) centre to ensure proper decision making in crowded area.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 300
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [48.86248803298562, 2.346451131285925]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
# for elem in landmarks :
# print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
def test_new_york(client, request) : # pylint: disable=redefined-outer-name
"""
Custom test in New York to ensure proper decision making in a crowded area.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 600
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 5},
"nature": {"type": "nature", "score": 5},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [40.72592726802, -73.9920434795]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
# for elem in landmarks :
# print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"
def test_shopping(client, request) : # pylint: disable=redefined-outer-name
"""
Test n°3 : Custom test in Lyon centre to ensure shopping clusters are found.
Args:
client:
request:
"""
start_time = time.time() # Start timer
duration_minutes = 240
response = client.post(
"/trip/new",
json={
"preferences": {"sightseeing": {"type": "sightseeing", "score": 0},
"nature": {"type": "nature", "score": 0},
"shopping": {"type": "shopping", "score": 5},
"max_time_minute": duration_minutes,
"detour_tolerance_minute": 0},
"start": [45.7576485, 4.8330241]
}
)
result = response.json()
landmarks = load_trip_landmarks(client, result['first_landmark_uuid'])
# Get computation time
comp_time = time.time() - start_time
# Add details to report
log_trip_details(request, landmarks, result['total_time'], duration_minutes)
for elem in landmarks :
print(elem)
# checks :
assert response.status_code == 200 # check for successful planning
assert comp_time < 30, f"Computation time exceeded 30 seconds: {comp_time:.2f} seconds"
assert duration_minutes*0.8 < result['total_time'], f"Trip too short: {result['total_time']} instead of {duration_minutes}"
assert duration_minutes*1.2 > result['total_time'], f"Trip too long: {result['total_time']} instead of {duration_minutes}"

View File

@@ -0,0 +1,101 @@
"""Collection of tests to ensure correct implementation and track progress. """
from fastapi.testclient import TestClient
import pytest
from ..structs.landmark import Toilets
from ..main import app
@pytest.fixture(scope="module")
def client():
"""Client used to call the app."""
return TestClient(app)
@pytest.mark.parametrize(
"location,radius,status_code",
[
({}, None, 422), # Invalid case: no location at all.
([443], None, 422), # Invalid cases: invalid location.
([443, 433], None, 422), # Invalid cases: invalid location.
]
)
def test_invalid_input(client, location, radius, status_code): # pylint: disable=redefined-outer-name
"""
Test n°1 : Verify handling of invalid input.
Args:
client:
request:
"""
response = client.post(
"/toilets/new",
params={
"location": location,
"radius": radius
}
)
# checks :
assert response.status_code == status_code
@pytest.mark.parametrize(
"location,status_code",
[
([48.2270, 7.4370], 200), # Orschwiller.
([10.2012, 10.123], 200), # Nigerian desert.
([63.989, -19.677], 200), # Hekla volcano, Iceland
]
)
def test_no_toilets(client, location, status_code): # pylint: disable=redefined-outer-name
"""
Test n°2 : Verify the endpoint still responds for locations without toilets nearby.
Args:
client:
request:
"""
response = client.post(
"/toilets/new",
params={
"location": location
}
)
toilets_list = [Toilets.model_validate(toilet) for toilet in response.json()]
# checks :
assert response.status_code == status_code # check for successful planning
assert isinstance(toilets_list, list) # check that the return type is a list
@pytest.mark.parametrize(
"location,status_code",
[
([45.7576485, 4.8330241], 200), # Lyon, Bellecour.
([-6.913795, 107.60278], 200), # Bandung, train station
([-22.970140, -43.18181], 200), # Rio de Janeiro, Copacabana
]
)
def test_toilets(client, location, status_code): # pylint: disable=redefined-outer-name
"""
Test n°3 : Verify the code finds some toilets in big cities.
Args:
client:
request:
"""
response = client.post(
"/toilets/new",
params={
"location": location,
"radius" : 600
}
)
toilets_list = [Toilets.model_validate(toilet) for toilet in response.json()]
# checks :
assert response.status_code == status_code # check for successful planning
assert isinstance(toilets_list, list) # check that the return type is a list
assert len(toilets_list) > 0

View File

@@ -0,0 +1,94 @@
"""Helper methods for testing."""
import logging
from fastapi import HTTPException
from pydantic import ValidationError
from ..structs.landmark import Landmark
from ..cache import client as cache_client
def landmarks_to_osmid(landmarks: list[Landmark]) -> list[int] :
"""
Convert the list of landmarks into a list containing their osm ids for quick landmark checking.
Args :
landmarks (list): the list of landmarks
Returns :
ids (list) : the list of corresponding OSM ids
"""
return [landmark.osm_id for landmark in landmarks]
def fetch_landmark(landmark_uuid: str):
"""
Fetch landmark data from the cache based on the landmark UUID.
Args:
landmark_uuid (str): The UUID of the landmark.
Returns:
dict: Landmark data fetched from the cache or raises an HTTP exception.
"""
logger = logging.getLogger(__name__)
# Try to fetch the landmark data from the cache
try:
landmark = cache_client.get(f'landmark_{landmark_uuid}')
if not landmark :
logger.warning(f'Cache miss for landmark UUID: {landmark_uuid}')
raise HTTPException(status_code=404, detail=f'Landmark with UUID {landmark_uuid} not found in cache.')
# Validate that the fetched data is a Landmark object
if not isinstance(landmark, Landmark):
logger.error(f'Invalid cache data format for landmark UUID: {landmark_uuid}. Expected Landmark, got {type(landmark).__name__}.')
raise HTTPException(status_code=500, detail="Invalid cache data format.")
return landmark
except Exception as exc:
logger.error(f'Unexpected error occurred while fetching landmark UUID {landmark_uuid}: {exc}')
raise HTTPException(status_code=500, detail="An unexpected error occurred while fetching the landmark from the cache") from exc
def load_trip_landmarks(client, first_uuid: str) -> list[Landmark]:
"""
Load all landmarks for a trip using the response from the API.
Args:
first_uuid (str) : The UUID of the first landmark in the trip.
Returns:
landmarks (list) : A list containing all landmarks for the trip.
"""
landmarks = []
next_uuid = first_uuid
while next_uuid is not None:
landmark = fetch_landmark(next_uuid)
landmarks.append(landmark)
next_uuid = landmark.next_uuid # Prepare for the next iteration
return landmarks
def log_trip_details(request, landmarks: list[Landmark], duration: int, target_duration: int) :
"""
Show the detailed trip in the HTML test report.
Args:
request:
landmarks (list): the ordered list of visited landmarks
duration (int): the total duration of this trip
target_duration(int): the target duration of this trip
"""
trip_string = [f'{landmark.name} ({landmark.attractiveness} | {landmark.duration}) - {landmark.time_to_reach_next}' for landmark in landmarks]
# Pass additional info to pytest for reporting
request.node.trip_details = trip_string
request.node.trip_duration = str(duration) # result['total_time']
request.node.target_duration = str(target_duration)

View File

View File

@@ -0,0 +1,308 @@
"""Find clusters of interest to add more general areas of visit to the tour."""
import logging
from typing import Literal
import numpy as np
from sklearn.cluster import DBSCAN
from pydantic import BaseModel
from ..overpass.overpass import Overpass, get_base_info
from ..structs.landmark import Landmark
from .get_time_distance import get_distance
from ..constants import OSM_CACHE_DIR
# silence the overpass logger
logging.getLogger('Overpass').setLevel(level=logging.CRITICAL)
class Cluster(BaseModel):
"""
A class representing an interesting area for shopping or sightseeing.
It can represent either a general area or a specific route with start and end point.
The importance represents the number of shops found in this cluster.
Attributes:
type : either a 'street' or 'area' (representing a denser field of shops).
importance : size of the cluster (number of points).
centroid : center of the cluster.
start : if the type is a street it goes from here...
end : ...to here
"""
type: Literal['street', 'area']
importance: int
centroid: tuple
# start: Optional[list] = None # for later use if we want to have streets as well
# end: Optional[list] = None
class ClusterManager:
"""
A manager responsible for clustering points of interest, such as shops or historic sites,
to identify areas worth visiting. It uses the DBSCAN algorithm to detect clusters
based on a set of points retrieved from OpenStreetMap (OSM).
Attributes:
logger (logging.Logger): Logger for capturing relevant events and errors.
valid (bool): Indicates whether clusters were successfully identified.
all_points (list): All points retrieved from OSM, representing locations of interest.
cluster_points (list): Points identified as part of a cluster.
cluster_labels (list): Labels corresponding to the clusters each point belongs to.
cluster_type (Literal['sightseeing', 'shopping']): Type of clustering, either for sightseeing
landmarks or shopping areas.
"""
logger = logging.getLogger(__name__)
# NOTE: all points are in (lat, lon) format
valid: bool # Ensure the manager is valid (ie there are some clusters to be found)
all_points: list
cluster_points: list
cluster_labels: list
cluster_type: Literal['sightseeing', 'shopping']
def __init__(self, bbox: tuple, cluster_type: Literal['sightseeing', 'shopping']) -> None:
"""
Upon initialization, generate the point cloud used for cluster detection.
The points represent bag/clothes shops and general boutiques.
If the first step is successful, it applies the DBSCAN clustering algorithm with different
parameters depending on the size of the city (number of points).
It filters out noise points and keeps only the largest clusters.
A successful initialization updates:
- `self.cluster_points`: The points belonging to clusters.
- `self.cluster_labels`: The labels for the points in clusters.
The method also calls `filter_clusters()` to retain only the largest clusters.
Args:
bbox: The bounding box coordinates (around:radius, center_lat, center_lon).
"""
# Setup the caching in the Overpass class.
self.overpass = Overpass(caching_strategy='XML', cache_dir=OSM_CACHE_DIR)
self.cluster_type = cluster_type
if cluster_type == 'shopping' :
osm_types = ['node']
sel = '"shop"~"^(bag|boutique|clothes)$"'
out = 'ids center'
elif cluster_type == 'sightseeing' :
osm_types = ['way']
sel = '"historic"~"^(monument|building|yes)$"'
out = 'ids center'
else :
raise NotImplementedError("Please choose only an available option for cluster detection")
# Initialize the points for cluster detection
query = self.overpass.build_query(
area = bbox,
osm_types = osm_types,
selector = sel,
out = out
)
self.logger.debug(f"Cluster query: {query}")
try:
result = self.overpass.send_query(query)
except Exception as e:
self.logger.error(f"Error fetching landmarks: {e}")
self.valid = False
return
if result is None :
self.logger.error(f"Error fetching {cluster_type} clusters, overpass query returned None.")
self.valid = False
else :
points = []
for osm_type in osm_types :
for elem in result.findall(osm_type):
# Get coordinates and append them to the points list
_, coords = get_base_info(elem, osm_type)
if coords is not None :
points.append(coords)
if points :
self.all_points = np.array(points)
# Apply DBSCAN to find clusters. Choose different settings for different cities.
if self.cluster_type == 'shopping' and len(self.all_points) > 200 :
dbscan = DBSCAN(eps=0.00118, min_samples=15, algorithm='kd_tree') # for large cities
elif self.cluster_type == 'sightseeing' :
dbscan = DBSCAN(eps=0.0025, min_samples=15, algorithm='kd_tree') # for historic neighborhoods
else :
dbscan = DBSCAN(eps=0.00075, min_samples=10, algorithm='kd_tree') # for small cities
labels = dbscan.fit_predict(self.all_points)
# Check that there is at least one cluster
if len(set(labels)) > 1 :
self.logger.debug(f"Found {len(set(labels))} different clusters.")
# Separate clustered points and noise points
self.cluster_points = self.all_points[labels != -1]
self.cluster_labels = labels[labels != -1]
self.filter_clusters() # Keep only the largest clusters. (Occasionally raises ValueError in np.vstack; cause unknown.)
self.valid = True
else :
self.logger.debug(f"Detected 0 {cluster_type} clusters.")
self.valid = False
else :
self.logger.debug(f"Detected 0 {cluster_type} clusters.")
self.valid = False
def generate_clusters(self) -> list[Landmark]:
"""
Generate a list of landmarks based on identified clusters.
This method iterates over the different clusters, calculates the centroid
(as the mean of the points within each cluster), and assigns an importance
based on the size of the cluster.
The generated shopping locations are stored in `self.clusters`
as a list of `Cluster` objects, each with:
- `type`: Set to 'area'.
- `centroid`: The calculated centroid of the cluster.
- `importance`: The number of points in the cluster.
"""
if not self.valid :
return [] # Return empty list if no clusters were found
locations = []
# loop through the different clusters
for label in set(self.cluster_labels):
# Extract points belonging to the current cluster
current_cluster = self.cluster_points[self.cluster_labels == label]
# Calculate the centroid as the mean of the points
centroid = np.mean(current_cluster, axis=0)
if self.cluster_type == 'shopping' :
score = len(current_cluster)*2
else :
score = len(current_cluster)*8
locations.append(Cluster(
type='area',
centroid=centroid,
importance = score
))
# Transform the locations in landmarks and return the list
cluster_landmarks = []
for cluster in locations :
cluster_landmarks.append(self.create_landmark(cluster))
return cluster_landmarks
def create_landmark(self, cluster: Cluster) -> Landmark:
"""
Create a Landmark object based on the given cluster.
This method queries the Overpass API for nearby neighborhoods and shopping malls
within a 1000m radius around the cluster centroid. It selects the closest
result and creates a landmark with the associated details such as name, type, and OSM ID.
Parameters:
cluster (Cluster): A Cluster object containing
the centroid and importance of the area.
Returns:
Landmark: A Landmark object containing details such as the name, type,
location, attractiveness, and OSM details.
"""
# Define the bounding box for a given radius around the coordinates
lat, lon = cluster.centroid
bbox = (1000, lat, lon)
# Query neighborhoods and shopping malls
selectors = ['"place"~"^(suburb|neighborhood|neighbourhood|quarter|city_block)$"']
if self.cluster_type == 'shopping' :
selectors.append('"shop"="mall"')
new_name = 'Shopping Area'
t = 40
else :
new_name = 'Neighborhood'
t = 15
min_dist = float('inf')
osm_id = 0
osm_type = 'node'
osm_types = ['node', 'way', 'relation']
for sel in selectors :
query = self.overpass.build_query(
area = bbox,
osm_types = osm_types,
selector = sel,
out = 'ids center'
)
try:
result = self.overpass.send_query(query)
except Exception as e:
self.logger.error(f"Error fetching landmarks: {e}")
continue
if result is None :
self.logger.error("Error fetching landmarks: overpass query returned None.")
continue
for elem_type in osm_types :
for elem in result.findall(elem_type):
elem_id, coords, name = get_base_info(elem, elem_type, with_name=True)
if name is None or coords is None :
continue
d = get_distance(cluster.centroid, coords)
if d < min_dist :
min_dist = d
new_name = name
osm_type = elem_type # Record the element type ('node', 'way' or 'relation') of the closest match
osm_id = elem_id # Record the OSM id of the closest match
return Landmark(
name=new_name,
type=self.cluster_type,
location=cluster.centroid, # later: use the fact that we can also recognize streets.
attractiveness=cluster.importance,
n_tags=0,
osm_id=osm_id,
osm_type=osm_type,
duration=t
)
def filter_clusters(self):
"""
Filter clusters to retain only the 5 largest clusters by point count.
This method calculates the size of each cluster and filters out all but the
5 largest clusters. It then updates the cluster points and labels to reflect
only those from the top 5 clusters.
"""
label_counts = np.bincount(self.cluster_labels)
# Get the labels of the 5 largest clusters
top_5_labels = np.argsort(label_counts)[-5:]
# Keep only the points belonging to the top 5 clusters
filtered_cluster_points = []
filtered_cluster_labels = []
for label in top_5_labels:
filtered_cluster_points.append(self.cluster_points[self.cluster_labels == label])
filtered_cluster_labels.append(np.full((label_counts[label],), label)) # Replicate the label
# update the cluster points and labels with the filtered data
self.cluster_points = np.vstack(filtered_cluster_points) # raises ValueError if the filtered list is empty
self.cluster_labels = np.concatenate(filtered_cluster_labels)
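The top-N selection used by `filter_clusters` can be checked on a tiny synthetic label array (numpy only; no DBSCAN run needed). Keeping 2 clusters here stands in for the 5 used above:

```python
import numpy as np

# Synthetic labels: cluster 0 has 2 points, cluster 1 has 4, cluster 2 has 1
labels = np.array([0, 1, 1, 2, 1, 0, 1])
counts = np.bincount(labels)    # per-cluster point counts: [2, 4, 1]
top = np.argsort(counts)[-2:]   # keep the 2 largest clusters in this sketch
assert set(top.tolist()) == {0, 1}
```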

View File

@@ -0,0 +1,80 @@
"""Contains various helper functions to help with distance or score computations."""
from math import sin, cos, sqrt, atan2, radians
import yaml
from ..constants import OPTIMIZER_PARAMETERS_PATH
with OPTIMIZER_PARAMETERS_PATH.open('r') as f:
parameters = yaml.safe_load(f)
DETOUR_FACTOR = parameters['detour_factor']
AVERAGE_WALKING_SPEED = parameters['average_walking_speed']
EARTH_RADIUS_KM = 6373
def get_time(p1: tuple[float, float], p2: tuple[float, float]) -> int:
"""
Calculate the time in minutes to travel from one location to another.
Args:
p1 (tuple[float, float]): Coordinates of the starting location.
p2 (tuple[float, float]): Coordinates of the destination.
Returns:
int: Time to travel from p1 to p2 in minutes.
"""
# Compute the distance in km along the surface of the Earth
# (assume spherical Earth).
# This is the haversine formula, adapted from stackoverflow
# in order to not use any external libraries.
lat1, lon1 = radians(p1[0]), radians(p1[1])
lat2, lon2 = radians(p2[0]), radians(p2[1])
dlon = lon2 - lon1
dlat = lat2 - lat1
a = sin(dlat / 2)**2 + cos(lat1) * cos(lat2) * sin(dlon / 2)**2
c = 2 * atan2(sqrt(a), sqrt(1 - a))
distance = EARTH_RADIUS_KM * c
# Consider the detour factor for an average city
walk_distance = distance * DETOUR_FACTOR
# Time to walk this distance (in minutes)
walk_time = walk_distance / AVERAGE_WALKING_SPEED * 60
# Clamp the result so that it fits into a 16-bit signed integer
return min(round(walk_time), 32765)
def get_distance(p1: tuple[float, float], p2: tuple[float, float]) -> float:
"""
Calculate the great-circle distance in kilometers between two locations.
Args:
p1 (tuple[float, float]): Coordinates of the starting location.
p2 (tuple[float, float]): Coordinates of the destination.
Returns:
float: Distance between p1 and p2 in kilometers.
"""
if p1 == p2:
return 0
# Compute the distance in km along the surface of the Earth
# (assume spherical Earth)
# this is the haversine formula, adapted from a Stack Overflow answer
# in order to not use any external libraries
lat1, lon1 = radians(p1[0]), radians(p1[1])
lat2, lon2 = radians(p2[0]), radians(p2[1])
dlon = lon2 - lon1
dlat = lat2 - lat1
a = sin(dlat / 2)**2 + cos(lat1) * cos(lat2) * sin(dlon / 2)**2
c = 2 * atan2(sqrt(a), sqrt(1 - a))
return EARTH_RADIUS_KM * c
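As a quick sanity check, the haversine computation above can be exercised standalone. The city coordinates below are approximate, and the expected result is the well-known great-circle distance of roughly 340 km between Paris and London:

```python
from math import sin, cos, sqrt, atan2, radians

EARTH_RADIUS_KM = 6373

def haversine_km(p1, p2):
    """Great-circle distance in km, assuming a spherical Earth."""
    lat1, lon1 = radians(p1[0]), radians(p1[1])
    lat2, lon2 = radians(p2[0]), radians(p2[1])
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2)**2 + cos(lat1) * cos(lat2) * sin(dlon / 2)**2
    return EARTH_RADIUS_KM * 2 * atan2(sqrt(a), sqrt(1 - a))

paris = (48.8566, 2.3522)
london = (51.5074, -0.1278)
d = haversine_km(paris, london)   # roughly 340 km
```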


@@ -0,0 +1,325 @@
"""Module used to import data from OSM and arrange them in categories."""
import logging
import xml.etree.ElementTree as ET
import yaml
from ..structs.preferences import Preferences
from ..structs.landmark import Landmark
from .take_most_important import take_most_important
from .cluster_manager import ClusterManager
from ..overpass.overpass import Overpass, get_base_info
from ..constants import AMENITY_SELECTORS_PATH, LANDMARK_PARAMETERS_PATH, OPTIMIZER_PARAMETERS_PATH, OSM_CACHE_DIR
# silence the overpass logger
logging.getLogger('Overpass').setLevel(level=logging.CRITICAL)
class LandmarkManager:
"""
Use this to manage landmarks.
Uses the overpass api to fetch landmarks and classify them.
"""
logger = logging.getLogger(__name__)
radius_close_to: int # radius in meters
church_coeff: float # coeff to adjust score of churches
nature_coeff: float # coeff to adjust score of parks
overall_coeff: float # coeff to adjust weight of tags
n_important: int # number of important landmarks to consider
def __init__(self) -> None:
with AMENITY_SELECTORS_PATH.open('r') as f:
self.amenity_selectors = yaml.safe_load(f)
with LANDMARK_PARAMETERS_PATH.open('r') as f:
parameters = yaml.safe_load(f)
self.max_bbox_side = parameters['city_bbox_side']
self.radius_close_to = parameters['radius_close_to']
self.church_coeff = parameters['church_coeff']
self.nature_coeff = parameters['nature_coeff']
self.overall_coeff = parameters['overall_coeff']
self.tag_exponent = parameters['tag_exponent']
self.image_bonus = parameters['image_bonus']
self.name_bonus = parameters['name_bonus']
self.wikipedia_bonus = parameters['wikipedia_bonus']
self.viewpoint_bonus = parameters['viewpoint_bonus']
self.pay_bonus = parameters['pay_bonus']
self.n_important = parameters['N_important']
with OPTIMIZER_PARAMETERS_PATH.open('r') as f:
parameters = yaml.safe_load(f)
self.walking_speed = parameters['average_walking_speed']
self.detour_factor = parameters['detour_factor']
# Setup the caching in the Overpass class.
self.overpass = Overpass(caching_strategy='XML', cache_dir=OSM_CACHE_DIR)
self.logger.info('LandmarkManager successfully initialized.')
def generate_landmarks_list(self, center_coordinates: tuple[float, float], preferences: Preferences) -> tuple[list[Landmark], list[Landmark]]:
"""
Generate and prioritize a list of landmarks based on user preferences.
This method fetches landmarks from various categories (sightseeing, nature, shopping) based on the user's preferences
and current location. It scores and corrects these landmarks, removes duplicates, and then selects the most important
landmarks based on a predefined criterion.
Args:
center_coordinates (tuple[float, float]): The latitude and longitude of the center location around which to search.
preferences (Preferences): The user's preference settings that influence the landmark selection.
Returns:
tuple[list[Landmark], list[Landmark]]:
- A list of all existing landmarks.
- A list of the most important landmarks based on the user's preferences.
"""
self.logger.debug('Starting to fetch landmarks...')
max_walk_dist = int((preferences.max_time_minute/2)/60*self.walking_speed*1000/self.detour_factor)
reachable_bbox_side = min(max_walk_dist, self.max_bbox_side)
# use set to avoid duplicates, this requires some __methods__ to be set in Landmark
all_landmarks = set()
# Create a bbox using the around technique: (radius_m, center_lat, center_lon)
bbox = tuple((min(2000, reachable_bbox_side/2), center_coordinates[0], center_coordinates[1]))
# list for sightseeing
if preferences.sightseeing.score != 0:
self.logger.debug('Fetching sightseeing landmarks...')
current_landmarks = self.fetch_landmarks(bbox, self.amenity_selectors['sightseeing'], preferences.sightseeing.type, preferences.sightseeing.score)
all_landmarks.update(current_landmarks)
self.logger.debug('Fetching sightseeing clusters...')
# special pipeline for historic neighborhoods
neighborhood_manager = ClusterManager(bbox, 'sightseeing')
historic_clusters = neighborhood_manager.generate_clusters()
all_landmarks.update(historic_clusters)
self.logger.debug('Sightseeing clusters done')
# list for nature
if preferences.nature.score != 0:
self.logger.debug('Fetching nature landmarks...')
current_landmarks = self.fetch_landmarks(bbox, self.amenity_selectors['nature'], preferences.nature.type, preferences.nature.score)
all_landmarks.update(current_landmarks)
# list for shopping
if preferences.shopping.score != 0:
self.logger.debug('Fetching shopping landmarks...')
current_landmarks = self.fetch_landmarks(bbox, self.amenity_selectors['shopping'], preferences.shopping.type, preferences.shopping.score)
self.logger.debug('Fetching shopping clusters...')
# Set the visit duration for all shopping activities:
for landmark in current_landmarks :
landmark.duration = 30
all_landmarks.update(current_landmarks)
# special pipeline for shopping malls
shopping_manager = ClusterManager(bbox, 'shopping')
shopping_clusters = shopping_manager.generate_clusters()
all_landmarks.update(shopping_clusters)
self.logger.debug('Shopping clusters done')
landmarks_constrained = take_most_important(all_landmarks, self.n_important)
# self.logger.info(f'All landmarks generated : {len(all_landmarks)} landmarks around {center_coordinates}, and constrained to {len(landmarks_constrained)} most important ones.')
return all_landmarks, landmarks_constrained
def set_landmark_score(self, landmark: Landmark, landmarktype: str, preference_level: int) :
"""
Calculate and set the attractiveness score for a given landmark.
This method evaluates the landmark's attractiveness based on its properties
(number of tags, presence of Wikipedia URL, image, website, and whether it's
a place of worship) and adjusts the score using the user's preference level.
Args:
landmark (Landmark): The landmark object to score.
landmarktype (str): The type of the landmark (used to apply the nature coefficient).
preference_level (int): The user's preference level for this landmark type.
"""
score = landmark.n_tags**self.tag_exponent
if landmark.wiki_url :
score *= self.wikipedia_bonus
if landmark.image_url :
score *= self.image_bonus
if landmark.website_url :
score *= self.wikipedia_bonus # no dedicated website bonus; the wikipedia one is reused
if landmark.is_place_of_worship :
score *= self.church_coeff
if landmarktype == 'nature' :
score *= self.nature_coeff
landmark.attractiveness = int(score * preference_level * 2)
def fetch_landmarks(self, bbox: tuple, amenity_selector: dict, landmarktype: str, preference_level: int) -> list[Landmark]:
"""
Fetches landmarks of a specified type from OpenStreetMap (OSM) within a bounding box centered on given coordinates.
Args:
bbox (tuple[float, float, float, float]): The bounding box coordinates (around:radius, center_lat, center_lon).
amenity_selector (dict): The Overpass API query selector for the desired landmark type.
landmarktype (str): The type of the landmark (e.g., 'sightseeing', 'nature', 'shopping').
Returns:
list[Landmark]: A list of Landmark objects that were fetched and filtered based on the provided criteria.
Notes:
- Landmarks are fetched using Overpass API queries.
- Selectors are translated from the dictionary to the Overpass query format. (e.g., 'amenity'='place_of_worship')
- Landmarks are filtered based on various conditions including tags and type.
"""
return_list = []
if landmarktype == 'nature' : query_conditions = []
else : query_conditions = ['count_tags()>5']
# caution, when applying a list of selectors, overpass will search for elements that match ALL selectors simultaneously
# we need to split the selectors into separate queries and merge the results
for sel in dict_to_selector_list(amenity_selector):
# self.logger.debug(f"Current selector: {sel}")
osm_types = ['way', 'relation']
if 'viewpoint' in sel :
query_conditions = []
osm_types.append('node')
query = self.overpass.build_query(
area = bbox,
osm_types = osm_types,
selector = sel,
conditions = query_conditions, # except for nature....
out = 'center'
)
self.logger.debug(f"Query: {query}")
try:
result = self.overpass.send_query(query)
except Exception as e:
self.logger.error(f"Error fetching landmarks: {e}")
continue
return_list += self.xml_to_landmarks(result, landmarktype, preference_level)
self.logger.debug(f"Fetched {len(return_list)} landmarks of type {landmarktype} in {bbox}")
return return_list
def xml_to_landmarks(self, root: ET.Element, landmarktype: str, preference_level: int) -> list[Landmark]:
"""
Parse the Overpass API result and extract landmarks.
This method processes the XML root element returned by the Overpass API and
extracts landmarks of types 'node', 'way', and 'relation'. It retrieves
relevant information such as name, coordinates, and tags, and converts them
into Landmark objects.
Args:
root (ET.Element): The root element of the XML response from Overpass API.
landmarktype (str): The type of the landmarks (e.g., 'sightseeing', 'nature', 'shopping').
preference_level (int): The user's preference level for this landmark type.
Returns:
list[Landmark]: A list of Landmark objects extracted from the XML data.
"""
if root is None :
return []
landmarks = []
for osm_type in ['node', 'way', 'relation'] :
for elem in root.findall(osm_type):
elem_id, coords, name = get_base_info(elem, osm_type, with_name=True)
if name is None or coords is None :
continue
tags = elem.findall('tag')
# Convert this to a Landmark object
landmark = Landmark(name=name,
type=landmarktype,
location=coords,
osm_id=elem_id, # renamed to avoid shadowing the built-in id()
osm_type=osm_type,
attractiveness=0,
n_tags=len(tags))
# Browse through tags to add information to landmark.
for tag in tags:
key = tag.get('k')
value = tag.get('v')
# Skip this landmark if not suitable.
if key == 'building:part' and value == 'yes' :
break
if 'disused:' in key :
break
if 'boundary:' in key :
break
if 'shop' in key and landmarktype != 'shopping' :
break
# if value == 'apartments' :
# break
# Fill in the other attributes.
if key == 'image' :
landmark.image_url = value
if key == 'website' :
landmark.website_url = value
if key == 'place_of_worship' :
landmark.is_place_of_worship = True
if key == 'wikipedia' :
landmark.wiki_url = value
if key == 'name:en' :
landmark.name_en = value
if 'building:' in key or 'pay' in key :
landmark.n_tags -= 1
# Set the duration.
if value in ['museum', 'aquarium', 'planetarium'] :
landmark.duration = 60
elif value == 'viewpoint' :
landmark.is_viewpoint = True
landmark.duration = 10
elif value == 'cathedral' :
landmark.is_place_of_worship = False
landmark.duration = 10
else: # for-else: runs only if no 'break' flagged the landmark as unsuitable
self.set_landmark_score(landmark, landmarktype, preference_level)
landmarks.append(landmark)
continue
return landmarks
def dict_to_selector_list(d: dict) -> list:
"""
Convert a dictionary of key-value pairs to a list of Overpass query strings.
Args:
d (dict): A dictionary of key-value pairs representing the selector.
Returns:
list: A list of strings representing the Overpass query selectors.
"""
return_list = []
for key, value in d.items():
if isinstance(value, list):
val = '|'.join(value)
return_list.append(f'{key}~"^({val})$"')
elif isinstance(value, str) and len(value) == 0:
return_list.append(f'{key}')
else:
return_list.append(f'{key}={value}')
return return_list
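The selector conversion can be demonstrated standalone (the function body below reproduces `dict_to_selector_list` verbatim so the snippet runs on its own; the input dictionary is made up):

```python
def dict_to_selector_list(d: dict) -> list:
    """Convert selector key/value pairs to Overpass selector strings."""
    return_list = []
    for key, value in d.items():
        if isinstance(value, list):
            val = '|'.join(value)
            return_list.append(f'{key}~"^({val})$"')   # regex alternation over the list
        elif isinstance(value, str) and len(value) == 0:
            return_list.append(f'{key}')               # key-only selector
        else:
            return_list.append(f'{key}={value}')       # exact key=value match
    return return_list

selectors = dict_to_selector_list({
    'tourism': ['museum', 'viewpoint'],
    'historic': '',
    'amenity': 'place_of_worship',
})
# → ['tourism~"^(museum|viewpoint)$"', 'historic', 'amenity=place_of_worship']
```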


@@ -0,0 +1,17 @@
"""Helper function to return only the major landmarks from a large list."""
from ..structs.landmark import Landmark
def take_most_important(landmarks: list[Landmark], n_important) -> list[Landmark]:
"""
Given a list of landmarks, return the n_important most important landmarks
Args:
landmarks: list[Landmark] - list of landmarks
n_important: int - number of most important landmarks to return
Returns:
list[Landmark] - list of the n_important most important landmarks
"""
# Sort landmarks by attractiveness (descending)
sorted_landmarks = sorted(landmarks, key=lambda x: x.attractiveness, reverse=True)
return sorted_landmarks[:n_important]
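The selection can be verified on a minimal stand-in; the tiny dataclass below is a hypothetical substitute for the real `Landmark` model, used only so the snippet is self-contained:

```python
from dataclasses import dataclass

@dataclass
class FakeLandmark:          # stand-in for the real Landmark model
    name: str
    attractiveness: int

def take_most_important(landmarks, n_important):
    """Return the n_important landmarks with the highest attractiveness."""
    return sorted(landmarks, key=lambda x: x.attractiveness, reverse=True)[:n_important]

pool = [FakeLandmark('a', 10), FakeLandmark('b', 50), FakeLandmark('c', 30)]
best = take_most_important(pool, 2)
# → landmarks 'b' (50) and 'c' (30)
```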


@@ -0,0 +1,125 @@
"""Module for finding public toilets around given coordinates."""
import logging
import xml.etree.ElementTree as ET
from ..overpass.overpass import Overpass, get_base_info
from ..structs.landmark import Toilets
from ..constants import OSM_CACHE_DIR
# silence the overpass logger
logging.getLogger('Overpass').setLevel(level=logging.CRITICAL)
class ToiletsManager:
"""
Manages the process of fetching and caching toilet information from
OpenStreetMap (OSM) based on a specified location and radius.
This class is responsible for:
- Fetching toilet data from OSM using Overpass API around a given set of
coordinates (latitude, longitude).
- Using a caching strategy to optimize requests by saving and retrieving
data from a local cache.
- Logging important events and errors related to data fetching.
Attributes:
logger (logging.Logger): Logger for the class to capture events.
location (tuple[float, float]): Latitude and longitude representing the
location to search around.
radius (int): The search radius in meters for finding nearby toilets.
overpass (Overpass): The Overpass API instance used to query OSM.
"""
logger = logging.getLogger(__name__)
location: tuple[float, float]
radius: int # radius in meters
def __init__(self, location: tuple[float, float], radius : int) -> None:
self.radius = radius
self.location = location
# Setup the caching in the Overpass class.
self.overpass = Overpass(caching_strategy='XML', cache_dir=OSM_CACHE_DIR)
def generate_toilet_list(self) -> list[Toilets] :
"""
Generates a list of toilet locations by fetching data from OpenStreetMap (OSM)
around the given coordinates stored in `self.location`.
Returns:
list[Toilets]: A list of `Toilets` objects containing detailed information
about the toilets found around the given coordinates.
"""
bbox = tuple((self.radius, self.location[0], self.location[1]))
osm_types = ['node', 'way', 'relation']
toilets_list = []
query = self.overpass.build_query(
area = bbox,
osm_types = osm_types,
selector = '"amenity"="toilets"',
out = 'ids center tags'
)
self.logger.debug(f"Query: {query}")
try:
result = self.overpass.send_query(query)
except Exception as e:
self.logger.error(f"Error fetching landmarks: {e}")
return [] # the signature promises a list, so avoid returning None
toilets_list = self.xml_to_toilets(result)
return toilets_list
def xml_to_toilets(self, root: ET.Element) -> list[Toilets]:
"""
Parse the Overpass API result and extract toilets.
This method processes the XML root element returned by the Overpass API and
extracts toilets of types 'node', 'way', and 'relation'. It retrieves
relevant information such as coordinates and tags, and converts them
into Toilets objects.
Args:
root (ET.Element): The root element of the XML response from Overpass API.
Returns:
list[Toilets]: A list of Toilets objects extracted from the XML data.
"""
if root is None :
return []
toilets_list = []
for osm_type in ['node', 'way', 'relation'] :
for elem in root.findall(osm_type):
# Get coordinates and append them to the points list
_, coords = get_base_info(elem, osm_type)
if coords is None :
continue
toilets = Toilets(location=coords)
# Extract tags as a dictionary
tags = {tag.get('k'): tag.get('v') for tag in elem.findall('tag')}
if tags.get('wheelchair') == 'yes':
toilets.wheelchair = True
if tags.get('changing_table') == 'yes':
toilets.changing_table = True
if tags.get('fee') == 'yes':
toilets.fee = True
if 'opening_hours' in tags :
toilets.opening_hours = tags['opening_hours']
toilets_list.append(toilets)
return toilets_list
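The tag extraction used above can be illustrated on a tiny Overpass-style XML fragment (the fragment itself is made up for demonstration; the dictionary comprehension mirrors the one in `xml_to_toilets`):

```python
import xml.etree.ElementTree as ET

xml = """
<osm>
  <node id="1" lat="48.1" lon="11.5">
    <tag k="amenity" v="toilets"/>
    <tag k="wheelchair" v="yes"/>
    <tag k="fee" v="no"/>
  </node>
</osm>
"""

root = ET.fromstring(xml)
node = root.find('node')
# Collapse the <tag> children into a plain dict, then read individual flags
tags = {tag.get('k'): tag.get('v') for tag in node.findall('tag')}
wheelchair = tags.get('wheelchair') == 'yes'
fee = tags.get('fee') == 'yes'
# → tags == {'amenity': 'toilets', 'wheelchair': 'yes', 'fee': 'no'}
```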


@@ -0,0 +1,59 @@
on:
push:
tags:
- 'v*'
jobs:
build:
runs-on: macos-latest
steps:
- uses: actions/checkout@v4
- name: Set up ruby env
uses: ruby/setup-ruby@v1
with:
ruby-version: 3.2.1
bundler-cache: true
- name: Setup java for android build
uses: actions/setup-java@v4
with:
java-version: '17'
distribution: 'zulu'
- name: Setup android SDK
uses: android-actions/setup-android@v3
- name: Install Flutter
uses: subosito/flutter-action@v2
with:
channel: stable
flutter-version: 3.22.0
cache: true
- name: Infer version number from git tag
id: version
env:
REF_NAME: ${{ github.ref_name }}
run: |
# remove the 'v' prefix from the tag name
echo "BUILD_NAME=${REF_NAME#v}" >> $GITHUB_ENV
- name: Put selected secrets into files
run: |
echo "${{ secrets.ANDROID_SECRET_PROPERTIES_BASE64 }}" | base64 -d > secrets.properties
echo "${{ secrets.ANDROID_GOOGLE_PLAY_JSON_BASE64 }}" | base64 -d > google-key.json
echo "${{ secrets.ANDROID_KEYSTORE_BASE64 }}" | base64 -d > release.keystore
working-directory: android
- name: Install fastlane
run: bundle install
working-directory: android
- name: Run fastlane lane
run: bundle exec fastlane deploy_release
working-directory: android
env:
BUILD_NUMBER: ${{ github.run_number }}
# BUILD_NAME is implicitly available
GOOGLE_MAPS_API_KEY: ${{ secrets.GOOGLE_MAPS_API_KEY }}


@@ -0,0 +1,64 @@
on:
push:
tags:
- 'v*'
jobs:
build:
runs-on: macos-latest
env:
# $BUNDLE_GEMFILE must be set at the job level, so it is set for all steps
BUNDLE_GEMFILE: ${{ github.workspace }}/ios/Gemfile
steps:
- uses: actions/checkout@v4
- name: Set up ruby env
uses: ruby/setup-ruby@v1
with:
ruby-version: 3.3
bundler-cache: true # runs 'bundle install' and caches installed gems automatically
- name: Install Flutter
uses: subosito/flutter-action@v2
with:
channel: stable
flutter-version: 3.22.0
cache: true
- name: Infer version number from git tag
id: version
env:
REF_NAME: ${{ github.ref_name }}
run: |
# remove the 'v' prefix from the tag name
echo "BUILD_NAME=${REF_NAME#v}" >> $GITHUB_ENV
- name: Setup SSH key for match git repo
# and mark the host as known
run: |
mkdir -p ~/.ssh
echo "$MATCH_REPO_SSH_KEY" | base64 --decode > ~/.ssh/id_rsa
chmod 600 ~/.ssh/id_rsa
ssh-keyscan -p 2222 git.kluster.moll.re > ~/.ssh/known_hosts
env:
MATCH_REPO_SSH_KEY: ${{ secrets.IOS_MATCH_REPO_SSH_KEY_BASE64 }}
- name: Install dependencies and clean up
run: |
flutter pub get
bundle exec pod install
flutter clean
bundle exec pod cache clean --all
working-directory: ios
- name: Run fastlane lane
run: bundle exec fastlane deploy_release --verbose
working-directory: ios
env:
BUILD_NUMBER: ${{ github.run_number }}
# BUILD_NAME is implicitly available
GOOGLE_MAPS_API_KEY: ${{ secrets.GOOGLE_MAPS_API_KEY }}
IOS_ASC_KEY_ID: ${{ secrets.IOS_ASC_KEY_ID }}
IOS_ASC_ISSUER_ID: ${{ secrets.IOS_ASC_ISSUER_ID }}
IOS_ASC_KEY: ${{ secrets.IOS_ASC_KEY }}
MATCH_PASSWORD: ${{ secrets.IOS_MATCH_PASSWORD }}
IOS_GOOGLE_MAPS_API_KEY: ${{ secrets.IOS_GOOGLE_MAPS_API_KEY }}

frontend/README.md Normal file

@@ -0,0 +1,61 @@
# Frontend
The frontend of this project is a Flutter application designed to run on both Android and iOS devices (and possibly as a PWA). It is responsible for displaying the user interface and handling user input, and it communicates with the backend via a REST API to retrieve and send data.
## Getting Started
The Flutter application is divided into multiple chunks of code:
- the `lib` directory contains the main code of the application.
- the `android` and `ios` directories contain platform-specific code.
- the root directory contains configuration files and metadata.
To run the application, you need to have the Flutter SDK installed. You can find instructions on how to do this [here](https://flutter.dev/docs/get-started/install).
Once you have the Flutter SDK installed, you can locally install the dependencies by running:
```bash
flutter pub get
```
## Development
### ...
### Icons and logos
The application uses a custom launcher icon and splash screen. These are managed platform-independently using the `flutter_launcher_icons` package.
To update the icons, change the `flutter_launcher_icons.yaml` configuration file. Especially the `image_path` is relevant. Then run
```bash
dart run flutter_launcher_icons
```
### Deploying a new version
To deploy a new version of the application to the official app stores, a dedicated CI step is required. It listens for new tags. To create a new tag, position yourself on the main branch and run
```bash
git tag -a v<name> -m "Release <name>"
git push origin v<name>
```
We adhere to the [Semantic Versioning](https://semver.org/) standard, so the tag should be of the form `v0.1.8` for example.
## Fastlane - in depth
The application is deployed to the Google Play Store and the Apple App Store using fastlane: [https://docs.fastlane.tools/](https://docs.fastlane.tools/)
Fastlane is installed as a Ruby gem. Since the Bundler Gemfile is scoped to a single directory, a `Gemfile` is included in both the `android` and `ios` directories. Once installed, the usage is:
```bash
cd frontend/android # or ios
bundle install
bundle exec fastlane <lane>
```
This is reused in the CI/CD pipeline to automate the deployment process.
Secrets used by fastlane are stored in HashiCorp Vault and are fetched by the CI/CD pipeline. See below.
## Secrets
These are mostly used by the CI/CD pipeline to deploy the application. The main usage from GitHub Actions is documented under [https://github.com/hashicorp/vault-action](https://github.com/hashicorp/vault-action).
**Platform-specific secrets** are used by the CI/CD pipeline to deploy to the respective app stores.
- `GOOGLE_MAPS_API_KEY` is used to authenticate with the Google Maps API and is scoped to the Android platform
- `ANDROID_KEYSTORE` is used to sign the Android APK
- `ANDROID_GOOGLE_KEY` is used to authenticate with the Google Play Store API
- `IOS_GOOGLE_MAPS_API_KEY` is used to authenticate with the Google Maps API and is scoped to the iOS platform
- `IOS_GOOGLE_...`
- `IOS_GOOGLE_...`
- `IOS_GOOGLE_...`


@@ -1,9 +1,10 @@
gradle-wrapper.jar
gradlew
gradlew.bat
gradle/
/.gradle
/captures/
/gradlew
/gradlew.bat
/local.properties
/secrets.properties
GeneratedPluginRegistrant.java
# Remember to never publicly share your keystore.
@@ -11,3 +12,6 @@ GeneratedPluginRegistrant.java
key.properties
**/*.keystore
**/*.jks
# Fastlane google cloud access
google-key.json

frontend/android/Gemfile Normal file

@@ -0,0 +1,3 @@
source "https://rubygems.org"
gem "fastlane"


@@ -0,0 +1,220 @@
GEM
remote: https://rubygems.org/
specs:
CFPropertyList (3.0.7)
base64
nkf
rexml
addressable (2.8.7)
public_suffix (>= 2.0.2, < 7.0)
artifactory (3.0.17)
atomos (0.1.3)
aws-eventstream (1.3.0)
aws-partitions (1.970.0)
aws-sdk-core (3.202.2)
aws-eventstream (~> 1, >= 1.3.0)
aws-partitions (~> 1, >= 1.651.0)
aws-sigv4 (~> 1.9)
jmespath (~> 1, >= 1.6.1)
aws-sdk-kms (1.88.0)
aws-sdk-core (~> 3, >= 3.201.0)
aws-sigv4 (~> 1.5)
aws-sdk-s3 (1.159.0)
aws-sdk-core (~> 3, >= 3.201.0)
aws-sdk-kms (~> 1)
aws-sigv4 (~> 1.5)
aws-sigv4 (1.9.1)
aws-eventstream (~> 1, >= 1.0.2)
babosa (1.0.4)
base64 (0.2.0)
claide (1.1.0)
colored (1.2)
colored2 (3.1.2)
commander (4.6.0)
highline (~> 2.0.0)
declarative (0.0.20)
digest-crc (0.6.5)
rake (>= 12.0.0, < 14.0.0)
domain_name (0.6.20240107)
dotenv (2.8.1)
emoji_regex (3.2.3)
excon (0.111.0)
faraday (1.10.3)
faraday-em_http (~> 1.0)
faraday-em_synchrony (~> 1.0)
faraday-excon (~> 1.1)
faraday-httpclient (~> 1.0)
faraday-multipart (~> 1.0)
faraday-net_http (~> 1.0)
faraday-net_http_persistent (~> 1.0)
faraday-patron (~> 1.0)
faraday-rack (~> 1.0)
faraday-retry (~> 1.0)
ruby2_keywords (>= 0.0.4)
faraday-cookie_jar (0.0.7)
faraday (>= 0.8.0)
http-cookie (~> 1.0.0)
faraday-em_http (1.0.0)
faraday-em_synchrony (1.0.0)
faraday-excon (1.1.0)
faraday-httpclient (1.0.1)
faraday-multipart (1.0.4)
multipart-post (~> 2)
faraday-net_http (1.0.2)
faraday-net_http_persistent (1.2.0)
faraday-patron (1.0.0)
faraday-rack (1.0.0)
faraday-retry (1.0.3)
faraday_middleware (1.2.0)
faraday (~> 1.0)
fastimage (2.3.1)
fastlane (2.222.0)
CFPropertyList (>= 2.3, < 4.0.0)
addressable (>= 2.8, < 3.0.0)
artifactory (~> 3.0)
aws-sdk-s3 (~> 1.0)
babosa (>= 1.0.3, < 2.0.0)
bundler (>= 1.12.0, < 3.0.0)
colored (~> 1.2)
commander (~> 4.6)
dotenv (>= 2.1.1, < 3.0.0)
emoji_regex (>= 0.1, < 4.0)
excon (>= 0.71.0, < 1.0.0)
faraday (~> 1.0)
faraday-cookie_jar (~> 0.0.6)
faraday_middleware (~> 1.0)
fastimage (>= 2.1.0, < 3.0.0)
gh_inspector (>= 1.1.2, < 2.0.0)
google-apis-androidpublisher_v3 (~> 0.3)
google-apis-playcustomapp_v1 (~> 0.1)
google-cloud-env (>= 1.6.0, < 2.0.0)
google-cloud-storage (~> 1.31)
highline (~> 2.0)
http-cookie (~> 1.0.5)
json (< 3.0.0)
jwt (>= 2.1.0, < 3)
mini_magick (>= 4.9.4, < 5.0.0)
multipart-post (>= 2.0.0, < 3.0.0)
naturally (~> 2.2)
optparse (>= 0.1.1, < 1.0.0)
plist (>= 3.1.0, < 4.0.0)
rubyzip (>= 2.0.0, < 3.0.0)
security (= 0.1.5)
simctl (~> 1.6.3)
terminal-notifier (>= 2.0.0, < 3.0.0)
terminal-table (~> 3)
tty-screen (>= 0.6.3, < 1.0.0)
tty-spinner (>= 0.8.0, < 1.0.0)
word_wrap (~> 1.0.0)
xcodeproj (>= 1.13.0, < 2.0.0)
xcpretty (~> 0.3.0)
xcpretty-travis-formatter (>= 0.0.3, < 2.0.0)
gh_inspector (1.1.3)
google-apis-androidpublisher_v3 (0.54.0)
google-apis-core (>= 0.11.0, < 2.a)
google-apis-core (0.11.3)
addressable (~> 2.5, >= 2.5.1)
googleauth (>= 0.16.2, < 2.a)
httpclient (>= 2.8.1, < 3.a)
mini_mime (~> 1.0)
representable (~> 3.0)
retriable (>= 2.0, < 4.a)
rexml
google-apis-iamcredentials_v1 (0.17.0)
google-apis-core (>= 0.11.0, < 2.a)
google-apis-playcustomapp_v1 (0.13.0)
google-apis-core (>= 0.11.0, < 2.a)
google-apis-storage_v1 (0.31.0)
google-apis-core (>= 0.11.0, < 2.a)
google-cloud-core (1.7.1)
google-cloud-env (>= 1.0, < 3.a)
google-cloud-errors (~> 1.0)
google-cloud-env (1.6.0)
faraday (>= 0.17.3, < 3.0)
google-cloud-errors (1.4.0)
google-cloud-storage (1.47.0)
addressable (~> 2.8)
digest-crc (~> 0.4)
google-apis-iamcredentials_v1 (~> 0.1)
google-apis-storage_v1 (~> 0.31.0)
google-cloud-core (~> 1.6)
googleauth (>= 0.16.2, < 2.a)
mini_mime (~> 1.0)
googleauth (1.8.1)
faraday (>= 0.17.3, < 3.a)
jwt (>= 1.4, < 3.0)
multi_json (~> 1.11)
os (>= 0.9, < 2.0)
signet (>= 0.16, < 2.a)
highline (2.0.3)
http-cookie (1.0.7)
domain_name (~> 0.5)
httpclient (2.8.3)
jmespath (1.6.2)
json (2.7.2)
jwt (2.8.2)
base64
mini_magick (4.13.2)
mini_mime (1.1.5)
multi_json (1.15.0)
multipart-post (2.4.1)
nanaimo (0.3.0)
naturally (2.2.1)
nkf (0.2.0)
optparse (0.5.0)
os (1.1.4)
plist (3.7.1)
public_suffix (6.0.1)
rake (13.2.1)
representable (3.2.0)
declarative (< 0.1.0)
trailblazer-option (>= 0.1.1, < 0.2.0)
uber (< 0.2.0)
retriable (3.1.2)
rexml (3.3.6)
strscan
rouge (2.0.7)
ruby2_keywords (0.0.5)
rubyzip (2.3.2)
security (0.1.5)
signet (0.19.0)
addressable (~> 2.8)
faraday (>= 0.17.5, < 3.a)
jwt (>= 1.5, < 3.0)
multi_json (~> 1.10)
simctl (1.6.10)
CFPropertyList
naturally
strscan (3.1.0)
terminal-notifier (2.0.0)
terminal-table (3.0.2)
unicode-display_width (>= 1.1.1, < 3)
trailblazer-option (0.1.2)
tty-cursor (0.7.1)
tty-screen (0.8.2)
tty-spinner (0.9.3)
tty-cursor (~> 0.7)
uber (0.1.0)
unicode-display_width (2.5.0)
word_wrap (1.0.0)
xcodeproj (1.25.0)
CFPropertyList (>= 2.3.3, < 4.0)
atomos (~> 0.1.3)
claide (>= 1.0.2, < 2.0)
colored2 (~> 3.1)
nanaimo (~> 0.3.0)
rexml (>= 3.3.2, < 4.0)
xcpretty (0.3.0)
rouge (~> 2.0.7)
xcpretty-travis-formatter (1.0.1)
xcpretty (~> 0.2, >= 0.0.7)
PLATFORMS
ruby
x86_64-linux
DEPENDENCIES
fastlane
BUNDLED WITH
2.5.18


@@ -0,0 +1,65 @@
## Android Setup
### Keystore setup
```bash
keytool -genkey -v -keystore release.keystore -keyalg RSA -keysize 2048 -validity 10000 -alias upload
```
- This is required to store local credentials securely and, more importantly, to sign the app for Google Play Store distribution.
### Using secret credentials during build
Following the guide under [https://developers.google.com/maps/flutter-package/config#android_1](https://developers.google.com/maps/flutter-package/config#android_1).
- Add the following to `android/build.gradle`:
```gradle
buildscript {
dependencies {
classpath "com.google.android.libraries.mapsplatform.secrets-gradle-plugin:secrets-gradle-plugin:2.0.1"
}
}
```
- Add the following to `android/app/build.gradle`:
```gradle
plugins {
// ...
id 'com.google.android.libraries.mapsplatform.secrets-gradle-plugin'
}
```
- Add the credentials to `android/secrets.properties`:
```properties
MAPS_API_KEY=YOUR_API_KEY
```
- Reference the credentials in `android/app/src/main/AndroidManifest.xml`:
```xml
<meta-data
android:name="com.google.android.geo.API_KEY"
android:value="${MAPS_API_KEY}" />
```
### Signing the app
Compared to the flutter template application, a few changes have to be made:
- Added to `android/app/build.gradle`:
```gradle
signingConfigs {
release {
keyAlias = secretProperties['keyAlias']
keyPassword = secretProperties['keyPassword']
storeFile = secretProperties['storeFile'] ? file(secretProperties['storeFile']) : null
storePassword = secretProperties['storePassword']
}
}
```
- Changed the `buildTypes` to use the `release` signing config:
```gradle
buildTypes {
release {
signingConfig signingConfigs.release
}
}
```
This makes use of the `secretProperties` defined previously:
```gradle
secretPropertiesFile.withReader('UTF-8') { reader ->
secretProperties.load(reader)
}
```


@@ -2,14 +2,19 @@ plugins {
id "com.android.application"
id "kotlin-android"
id "dev.flutter.flutter-gradle-plugin"
id 'com.google.android.libraries.mapsplatform.secrets-gradle-plugin'
// last is probably not needed
}
def localProperties = new Properties()
def localPropertiesFile = rootProject.file('local.properties')
def localProperties = new Properties()
if (localPropertiesFile.exists()) {
localPropertiesFile.withReader('UTF-8') { reader ->
localProperties.load(reader)
}
} else {
throw new GradleException("local.properties not found")
}
def flutterVersionCode = localProperties.getProperty('flutter.versionCode')
@@ -22,8 +27,27 @@ if (flutterVersionName == null) {
flutterVersionName = '1.0'
}
def secretPropertiesFile = rootProject.file('secrets.properties')
def fallbackPropertiesFile = rootProject.file('fallback.properties')
def secretProperties = new Properties()
if (secretPropertiesFile.exists()) {
secretPropertiesFile.withReader('UTF-8') { reader ->
secretProperties.load(reader)
}
} else if (fallbackPropertiesFile.exists()) {
fallbackPropertiesFile.withReader('UTF-8') { reader ->
secretProperties.load(reader)
}
} else {
throw new GradleException("Secrets file (secrets.properties, fallback.properties) not found")
}
android {
namespace "com.example.fast_network_navigation"
namespace "com.anydev.anyway"
compileSdk flutter.compileSdkVersion
ndkVersion flutter.ndkVersion
@@ -41,8 +65,8 @@ android {
}
defaultConfig {
// TODO: Specify your own unique Application ID (https://developer.android.com/studio/build/application-id.html).
applicationId "com.example.fast_network_navigation"
applicationId "com.anydev.anyway"
// You can update the following values to match your application needs.
// For more information, see: https://docs.flutter.dev/deployment/android#reviewing-the-gradle-build-configuration.
// Minimum Android version for Google Maps SDK
@@ -52,13 +76,23 @@ android {
targetSdkVersion flutter.targetSdkVersion
versionCode flutterVersionCode.toInteger()
versionName flutterVersionName
// Placeholders of keys that are replaced by the build system.
manifestPlaceholders += ['MAPS_API_KEY': System.getenv('GOOGLE_MAPS_API_KEY')]
}
signingConfigs {
release {
keyAlias = secretProperties['keyAlias']
keyPassword = secretProperties['keyPassword']
storeFile = secretProperties['storeFile'] ? file(secretProperties['storeFile']) : null
storePassword = secretProperties['storePassword']
}
}
buildTypes {
release {
// TODO: Add your own signing config for the release build.
// Signing with the debug keys for now, so `flutter run --release` works.
signingConfig signingConfigs.debug
signingConfig = signingConfigs.release
}
}
}


@@ -1,6 +1,12 @@
<manifest xmlns:android="http://schemas.android.com/apk/res/android">
<!-- Required to fetch data from the internet. -->
<uses-permission android:name="android.permission.INTERNET"/>
<!-- Required to show user location -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION"/>
<uses-permission android:name="android.permission.ACCESS_COARSE_LOCATION" />
<application
android:label="fast_network_navigation"
android:label="anyway"
android:name="${applicationName}"
android:icon="@mipmap/ic_launcher">
<activity
@@ -29,9 +35,13 @@
<meta-data
android:name="flutterEmbedding"
android:value="2"
/>
<meta-data
android:name="com.google.android.geo.API_KEY"
android:value="AIzaSyCeWk_D2xvfOHLidvV56EZeQCUybypEntw"/>
android:value="${MAPS_API_KEY}"
/>
</application>
<!-- Required to query activities that can process text, see:
https://developer.android.com/training/package-visibility?hl=en and
https://developer.android.com/reference/android/content/Intent#ACTION_PROCESS_TEXT.


@@ -1,4 +1,4 @@
package com.example.fast_network_navigation
package com.anydev.anyway
import io.flutter.embedding.android.FlutterActivity

Binary file not shown — Before: 544 B, After: 3.6 KiB

Binary file not shown — Before: 442 B, After: 2.3 KiB

Binary file not shown — Before: 721 B, After: 5.2 KiB

Binary file not shown — Before: 1.0 KiB, After: 9.4 KiB

Binary file not shown — Before: 1.4 KiB, After: 13 KiB


@@ -16,3 +16,14 @@ subprojects {
tasks.register("clean", Delete) {
delete rootProject.buildDir
}
buildscript {
repositories {
google()
mavenCentral()
}
dependencies {
classpath "com.google.android.libraries.mapsplatform.secrets-gradle-plugin:secrets-gradle-plugin:2.0.1"
}
}
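Declaring the classpath in the root build script is not enough on its own; the plugin still has to be applied in the module-level android/app/build.gradle. A minimal sketch of that step, assuming the Groovy DSL used elsewhere in this diff (not shown in the changeset itself):

```groovy
// Sketch only: apply the secrets plugin in android/app/build.gradle so it can
// inject values from a properties file (e.g. as manifest placeholders).
apply plugin: 'com.google.android.libraries.mapsplatform.secrets-gradle-plugin'
```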


@@ -0,0 +1,2 @@
# This file mirrors the state of secrets.properties as a reference for the developer,
# and serves as a fallback for build.gradle.
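Based on the keys that build.gradle's signingConfigs block reads (keyAlias, keyPassword, storeFile, storePassword), a fallback.properties placeholder might look like the following; the values are purely illustrative, not real credentials:

```properties
# Illustrative placeholders only — the real values live in secrets.properties,
# which is kept out of version control. Key names match what build.gradle reads.
keyAlias=upload
keyPassword=changeme
storeFile=upload-keystore.jks
storePassword=changeme
```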


@@ -0,0 +1,2 @@
json_key_file("google-key.json") # Path to the json secret file - Follow https://docs.fastlane.tools/actions/supply/#setup to get one
package_name("com.anydev.anyway") # e.g. com.krausefx.app


@@ -0,0 +1,52 @@
default_platform(:android)
platform :android do
desc "Deploy a new version to closed testing (play store)"
lane :deploy_testing do
build_name = ENV["BUILD_NAME"]
build_number = ENV["BUILD_NUMBER"]
sh(
"flutter",
"build",
"appbundle",
"--release",
"--build-name=#{build_name}",
"--build-number=#{build_number}",
)
upload_to_play_store(
track: 'alpha',
skip_upload_apk: true,
skip_upload_changelogs: true,
aab: "../build/app/outputs/bundle/release/app-release.aab",
# this is the default output of flutter build ... --release;
# the build folder lies in the Flutter project root,
# which is the parent of the android folder (hence the leading ../)
)
end
desc "Deploy a new version as a full release"
lane :deploy_release do
build_name = ENV["BUILD_NAME"]
build_number = ENV["BUILD_NUMBER"]
sh(
"flutter",
"build",
"appbundle",
"--release",
"--build-name=#{build_name}",
"--build-number=#{build_number}",
)
upload_to_play_store(
track: 'production',
skip_upload_apk: true,
skip_upload_changelogs: true,
aab: "../build/app/outputs/bundle/release/app-release.aab",
)
end
end


@@ -0,0 +1,40 @@
fastlane documentation
----
# Installation
Make sure you have the latest version of the Xcode command line tools installed:
```sh
xcode-select --install
```
For _fastlane_ installation instructions, see [Installing _fastlane_](https://docs.fastlane.tools/#installing-fastlane)
# Available Actions
## Android
### android deploy_testing
```sh
[bundle exec] fastlane android deploy_testing
```
Deploy a new version to closed testing (play store)
### android deploy_release
```sh
[bundle exec] fastlane android deploy_release
```
Deploy a new version as a full release
----
This README.md is auto-generated and will be re-generated every time [_fastlane_](https://fastlane.tools) is run.
More information about _fastlane_ can be found on [fastlane.tools](https://fastlane.tools).
The documentation of _fastlane_ can be found on [docs.fastlane.tools](https://docs.fastlane.tools).


@@ -0,0 +1,7 @@
AnyWay - plan city trips your way
AnyWay is a mobile application that helps users plan city trips. The app allows users to specify their preferences and constraints, and then generates a personalized itinerary for them. The planning follows some guiding principles:
- **Personalization**: The user's preferences should be reflected in the choice of destinations.
- **Efficiency**: The itinerary should be optimized for the user's constraints.
- **Flexibility**: We acknowledge that tourism is a dynamic activity, and that users may want to change their plans on the go.
- **Discoverability**: Tourism is an inherently exploratory activity. Once a rough itinerary is generated, detours and spontaneous decisions should be encouraged.

Binary file not shown — After: 106 KiB

Binary file not shown — After: 1.3 MiB

Binary file not shown — After: 637 KiB

Binary file not shown — After: 573 KiB

Binary file not shown — After: 175 KiB

Binary file not shown — After: 360 KiB


@@ -0,0 +1 @@
AnyWay - plan city trips your way!


@@ -0,0 +1 @@
AnyWay


@@ -1,5 +0,0 @@
distributionBase=GRADLE_USER_HOME
distributionPath=wrapper/dists
zipStoreBase=GRADLE_USER_HOME
zipStorePath=wrapper/dists
distributionUrl=https\://services.gradle.org/distributions/gradle-7.6.3-all.zip


@@ -20,7 +20,7 @@ pluginManagement {
plugins {
id "dev.flutter.flutter-plugin-loader" version "1.0.0"
id "com.android.application" version "7.3.0" apply false
id "org.jetbrains.kotlin.android" version "1.7.10" apply false
id "org.jetbrains.kotlin.android" version "2.0.20" apply false
}
include ":app"

frontend/assets/cat.svg (new file, 107 lines)

@@ -0,0 +1,107 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 500 500" style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g id="OBJECTS">
<g>
<path style="fill:#F2DBDE;" d="M381.005,363.01c-53.963,8.445-84.441,11.1-138.832,6.101
c-54.388-4.998-109.48-25.844-144.743-67.555c-23.468-27.759-36.728-62.943-43.732-98.613c-3.745-19.07-5.754-39.21,0.433-57.635
c7.513-22.378,26.565-39.569,48.136-49.156c21.572-9.589,45.552-12.365,69.151-12.944c47.753-1.172,95.706,6.26,140.863,21.831
c35.603,12.277,69.954,29.937,96.972,56.171c27.019,26.233,46.213,61.723,47.963,99.341
C458.967,298.17,438.434,354.022,381.005,363.01z"/>
<g>
<path style="fill:#F2BFC6;" d="M314.479,248.209c-22.398,36.41-29.246,81.831-19.597,123.401
c27.302-0.242,52.026-3.263,86.124-8.6c57.429-8.989,77.961-64.84,76.211-102.458c-1.503-32.308-15.881-63.041-37.024-87.694
C375.546,184.337,337.241,211.21,314.479,248.209z"/>
<path style="fill:#F2BFC6;" d="M60.074,229.111c2.232,7.566,4.802,15.029,7.749,22.32c40.138-5.931,78.066-26.379,104.834-56.907
c26.459-30.176,41.716-69.876,42.677-109.969c-14.6-1.246-29.267-1.705-43.916-1.345c-11.908,0.292-23.911,1.147-35.655,3.151
C136.569,142.478,107.155,198.423,60.074,229.111z"/>
<path style="fill:#F2BFC6;" d="M365.131,128.557c-16.748-9.529-34.631-17.233-52.85-23.516
c-6.45-2.224-12.962-4.262-19.517-6.153c-1.712,23.304-4.543,46.555-11.914,68.659c-9.236,27.692-26.464,53.808-52.01,67.931
c-22.973,12.7-50.376,14.689-74.443,25.169c-21.624,9.417-39.587,25.305-54.36,43.893c8.346,9.381,17.714,17.663,27.862,24.902
c16.736-21.461,41.874-37.166,67.161-48.559c35.578-16.03,74.129-26.682,105.739-49.566
C334.357,207.023,357.577,169.22,365.131,128.557z"/>
</g>
</g>
<ellipse style="opacity:0.15;fill:#2D3038;" cx="250.223" cy="394.224" rx="109.236" ry="18.917"/>
<g>
<path style="fill:#2D3038;" d="M305.132,388.442c-0.168,1.158-0.626,2.243-1.458,3.061c-1.863,1.832-4.823,1.724-7.427,1.538
c-17.939-1.285-36.017-0.625-53.815,1.965c-7.053,3.155-16.423,3.233-25.275,2.004c-8.853-1.231-17.514-3.684-26.397-4.661
c-8.885-0.976-21.867-0.33-26.499,2.758c0,0-7.266,3.996-12.907,12.021c-3.367,4.789-4.105,11.306-2.377,16.899
c2.452,7.945,10.312,13.334,18.475,14.912c8.163,1.579,16.603-0.053,24.6-2.327c22.82-6.49,43.805-18.134,66.018-26.468
c22.213-8.334,47.017-13.282,69.546-5.844c3.96,1.306,7.879,3.033,10.941,5.866c3.062,2.832,5.173,6.927,4.813,11.081
c-0.464,5.356-4.97,9.719-10.061,11.444c-5.092,1.726-10.658,1.275-15.953,0.346c-5.296-0.93-10.554-2.17-15.926-2.414
c-20.08-0.909-38.455,4.247-56.124,10.857c-17.669,6.608-35.096,14.21-53.56,18.085c-18.463,3.874-35.807,8.106-51.682-4.186
c-20.345-15.753-19.603-41.137-8.091-63.296c5.521-10.629,12.589-18.637,19.416-27.732c-1.72-12.542-6.898-24.945-9.467-37.525
c-4.135-20.25-1.309-41.854,7.666-61.314c5.614-15.439,11.257-30.942,19.093-45.38c7.835-14.438,18.007-27.88,31.297-37.536
c13.289-9.656,29.927-15.279,46.993-13.222c7.787-8.403,16.038-16.377,24.703-23.871c-1.319-7.29-1.183-14.637,0.584-20.961
c-4.077-8.872-8.2-17.907-9.54-27.579c-0.835-6.027-0.441-12.408,1.577-17.991c1.878-5.198,8.452-6.799,12.542-3.08
c6.673,6.07,12.683,12.869,17.891,20.235c18.398-4.802,38.164-4.231,56.264,1.583c6.473-8.017,14.398-14.861,23.286-20.075
c2.366-1.388,5.533-2.613,7.657-0.875c1.683,1.377,1.736,3.89,1.592,6.059c-0.815,12.217-3.418,24.313-8.016,36.577
c4.862,15.779,0.82,33.862-9.812,46.412c-2.168,11.956,1.193,24.438,2.504,36.665c2.294,21.385-1.98,43.411-12.271,62.744
c-2.4,4.508-5.754,8.444-9.863,11.477c-1.71,1.263-3.38,2.581-5.006,3.951c-5.172,20.881-10.139,41.311-15.351,62.281
c2.061,7.78,4.487,15.496,7.272,23.126c3.209-0.899,6.478-1.696,9.816-1.809c3.896-0.132,7.942,0.744,11.024,3.131
c2.308,1.785,3.979,4.375,4.658,7.212c0.484,2.028,0.445,4.26-0.563,6.086c-0.663,1.203-1.81,2.171-3.102,2.583
c0.454,1.78,0.565,3.616,0.106,5.385c-0.778,3.004-3.622,5.6-6.675,5.375c-0.047,0.112-0.097,0.223-0.151,0.333
c-0.979,1.985-3.08,3.228-5.239,3.714c-2.063,0.464-4.207,0.333-6.319,0.174c-0.138,0.225-0.3,0.437-0.489,0.633
c-1.556,1.603-4.16,1.338-6.346,0.87c-3.015-0.645-6.04-1.471-8.688-3.051c-2.647-1.583-4.906-4.013-5.707-6.991
c-1.237-4.607,2.111-10.097,0.151-14.313c-3.538-7.609-7.733-14.893-12.004-22.126c-8.712,7.077-18.162,13.242-28.147,18.367
c6.95-0.974,14.248-1.345,21.476-0.293c3.273,0.475,6.596,1.283,9.285,3.208c2.689,1.924,4.631,5.173,4.214,8.453
c-0.34,2.664-2.596,5.054-5.156,5.449"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M151.465,379.089
c0.578-3.877,0.614-7.729,0.28-11.566"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M296.431,98.602
c1.739,2.591,3.381,5.247,4.918,7.962"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M273.736,153.553
c-0.645-1.929-1.188-3.891-1.625-5.865"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M295.23,106.811
c-4.87-7.904-10.55-15.309-16.923-22.061c-1.834-1.943-4.156-3.987-6.799-3.598c-2.928,0.431-4.574,3.626-5.147,6.53
c-1.629,8.254,1.474,16.627,4.521,24.47"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M352.846,98.327
c1.084,0.372,2.162,0.763,3.232,1.174"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M363.545,168.179
c-1.077,1.107-2.211,2.161-3.399,3.155"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M295.583,130.136
c3.86-4.907,10.772-7.181,16.791-5.521c6.019,1.659,10.791,7.151,11.446,13.054c-4.594,3.601-11.6,3.717-16.311,0.268
c-3.162-2.315-5.105-6.101-5.423-9.993"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M363.109,126.785
c-1.79-2.631-5.159-4.002-8.321-3.646c-3.162,0.356-6.042,2.317-7.787,4.979c-1.743,2.662-2.395,5.96-1.828,9.854
c4.738,1.952,10.727,0.164,13.621-4.066c1.462-2.137,2.057-4.785,1.832-7.36"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M350.957,171.048
c-4.278,4.378-10.749,6.497-16.787,5.499"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M338.68,282.717
c-5.42,4.867-10.31,10.327-14.541,16.258"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M333.834,368.351
c0.757,2.017,1.54,4.028,2.348,6.032c2.26-0.589,4.541-1.183,6.876-1.268c2.333-0.084,4.757,0.381,6.656,1.74
c1.559,1.116,2.664,2.753,3.552,4.452c0.261,0.499,0.505,1.013,0.727,1.536"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M317.138,283.315
c0.476,18.805,3.038,37.553,7.633,55.961"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M337.823,376.837
c2.877-0.595,5.878,0.99,7.67,3.316c1.791,2.327,2.567,5.273,3.025,8.174c0.191,1.214,0.327,2.48,0.209,3.695"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M327.236,380.633
c3.086-0.38,6.102,1.606,7.733,4.252c1.632,2.645,2.112,5.835,2.285,8.939c0.04,0.721,0.054,1.476-0.027,2.204"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M305.059,385.808
c-0.036-0.193-0.079-0.385-0.128-0.573c-1.058-4.111-4.728-7.422-8.927-8.052"/>
<g>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M250.442,264.812
c-1.67-3.125-3.183-6.325-4.488-9.622c-5.098-12.883-6.92-27.047-5.248-40.801"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M302.266,351.248
c-7.667-12.944-15.022-25.405-19.496-39.762"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M272.435,372.065
c-3.368,0.554-6.637,1.226-9.757,1.918c10.852-22.715,21.971-46.794,19.913-71.883c-0.826-10.055-4.036-20.316-11.156-27.463
c-8.522-8.553-21.576-11.406-33.547-9.827c-22.022,2.903-41.327,20.57-46.167,42.248"/>
</g>
<g>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M328.579,152.076
c1.379-0.341,2.796,0.501,3.736,1.565c0.942,1.065,1.588,2.366,2.551,3.41c0.963,1.044,2.43,1.826,3.784,1.398
c1.002-0.317,1.702-1.217,2.207-2.139c0.504-0.921,0.888-1.923,1.572-2.721c1.237-1.447,3.432-1.978,5.192-1.258"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M360.735,158.173
c-2.16,5.007-7.325,8.57-12.773,8.812c-1.946,0.086-3.967-0.245-5.593-1.317c-1.872-1.234-2.979-3.253-3.85-5.361
c-0.089,1.146-0.496,2.29-1.133,3.25c-1.229,1.854-3.175,3.116-5.189,4.059c-3.3,1.546-7.007,2.373-10.616,1.879
c-3.611-0.495-7.099-2.413-9.07-5.477"/>
<path style="fill:none;stroke:#FFFFFF;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:10;" d="M338.276,158.534
c0,0,0.176,1.073,0.244,1.773"/>
</g>
</g>
</g>
</svg>

After: 9.5 KiB

frontend/assets/city.svg (new file, 273 lines)

@@ -0,0 +1,273 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 1000 700" style="enable-background:new 0 0 1000 700;" xml:space="preserve">
<g id="Shadow">
<g style="opacity:0.1;">
<path style="fill:#38415C;" d="M186.919,556.734c0,0.331,0.541,0.599,1.208,0.599c0.667,0,1.208-0.268,1.208-0.599
c0-0.331-0.541-0.599-1.208-0.599C187.46,556.135,186.919,556.403,186.919,556.734z"/>
<path style="fill:#38415C;" d="M957.699,446.328h-12.196h-37.267h-8.131h-4.525h-22.106h-29.729h-8.01h-8.921h-7.462H777.69h-5.38
h-8.517h-13.921h-24.898h-0.367h-35.201h-33.29h-13.405h-8.642h-18.387h-20.084H584.19h-2.542h-5.421h-37.944h-7.453h-2.757
h-2.757h-1.428h-8.748h-5.514h-10.175h-0.905h-4.609h-10.175h-5.514h-10.175h-2.757h-2.757h-27.938h-8.05h-29.96h-6.713h-18.964
h-11.234h-48.644h-12.099h-10.229h-20.764h-12.382h-3.512h-23.242h-5.943h-13.266h-10.795h-35.413h-16.467h-4.656h-8.696h-25.877
H89.054h-4.763H72.026h-7.508H53.821H42.302v9.376h11.519v6.41h10.696v6.835h19.774v2.177h20.658v9.405h17.084v-5.919h11.557
v10.475h3.789v11.69h11.18v9.823h-4.017v1.763h7.066v4.785h23.433v28.254h-1.845v1.897h4.028v2.429h4.636v0.913v2.777v2.41h5.594
v4.306h0.673l0,0h0.673v-4.306h3.015l1.823-2.41h12.206v-3.69h31.948V543.3h4.028v-1.897h-1.845V484.37h15.302v40.617h1.509v3.023
h2.811v10.012h1.016v-10.012h2.287v7.997h1.017v-7.997h4.828v-3.023h6.098v-10.569h11.445v-0.743h-1.492v-2.116h7.56v32.974
h-1.078v0.849h7.678v5.101h-0.992v0.817h7.047v2.933h-1.099v0.627h3.502v2.77h3.348v5.513h0.402h0.398h0.314v-5.513h1.354v9.889
h0.402h0.399h0.314v-9.889h0.451h2.897v-2.77h3.034v-0.627h-0.632v-2.933h7.047v-0.817h-0.992v-5.101h7.678v-0.849h-1.078v-43.096
h23.505v-5.982h8.399v31.88h-4.954v0.806h4.954v1.443h6.279v2.37h30.344v27.318h7.165v21.803h13.871V593h1.952v-11.927h4.298
v9.528h1.952v-9.528h21.941v-12.964h3.982v-8.839h11.148v-32.999h4.318v-7.629l6.342,0.769v1.742h5.514v-1.073l10.175,1.234v1.969
h5.514v-1.3l9.332,1.131v9.523h2.491v16.539h11.982v6.297h6.46v5.29h2.267v9.068h0.586v-9.068h2.267v-5.29h6.46v-6.297h12.075
v-16.539h2.399v-45.67l5.467,13.925h5.729v12.219h-2.645v0.527h31.278v6.75h-3.52v0.763h-1.791v2.08h8.284v2.313h-1.087v0.668
h18.198v-0.668h-1.087v-2.313h23.966c0.802,1.935,2.023,3.811,3.668,5.596l-3.992,0.913c-0.688-0.732-2.184-1.239-3.92-1.239
c-2.388,0-4.324,0.96-4.324,2.143c0,1.183,1.936,2.143,4.324,2.143c2.388,0,4.324-0.96,4.324-2.143
c0-0.239-0.08-0.468-0.225-0.683l4.015-0.919c2.595,2.749,6.165,5.281,10.623,7.491c0.352,0.174,0.709,0.346,1.069,0.515
l-3.154,1.668c-0.76-0.329-1.753-0.528-2.841-0.528c-2.388,0-4.324,0.959-4.324,2.143s1.936,2.143,4.324,2.143
c2.388,0,4.324-0.959,4.324-2.143c0-0.559-0.432-1.068-1.139-1.449l3.16-1.671c5.36,2.471,11.576,4.337,18.308,5.527l-1.744,2.453
c-0.378-0.054-0.777-0.083-1.19-0.083c-2.388,0-4.324,0.96-4.324,2.143c0,1.183,1.936,2.143,4.324,2.143
c2.388,0,4.324-0.96,4.324-2.143c0-0.895-1.107-1.662-2.68-1.982l1.743-2.451c5.551,0.953,11.445,1.449,17.493,1.449
c0.498,0,0.995-0.003,1.491-0.01l0.198,3.017c-2.096,0.148-3.707,1.041-3.707,2.121c0,1.183,1.936,2.143,4.324,2.143
c2.388,0,4.324-0.96,4.324-2.143c0-1.184-1.936-2.143-4.324-2.143c-0.046,0-0.091,0.001-0.137,0.002l-0.197-3.004
c2.456-0.044,4.881-0.173,7.265-0.378l-2.223,24.735l79.948-8.225v-43.336h13.883v22.309h24.985v8.902h1.355v-8.902h2.795v16.446
h1.355v-16.446h3.219v11.855h1.355v-11.855h4.235v-54.059h12.874V506.6h2.033v3.715h2.033v2.582h14.483v-2.582h2.033V506.6h7.369
v1.594h3.557V506.6h1.259v4.262h5.082V506.6h1.452l3.161-11.593h11.528v5.526h0.762v-5.526h4.32v6.746h0.762v-6.746h6.25v-1.567
h-1.592v-17.317h12.874l1.507-4.997h10.931v-9.012h11.954v-6.86h12.196V446.328z M653.829,518.335l-11.117,0.179v-7.937h2.055
v-0.76h-2.055v-0.593l13.295,2.426c-1.417,1.94-2.19,4.031-2.19,6.21C653.816,518.019,653.821,518.177,653.829,518.335z
M689.289,499.58c-4.354,0.083-8.516,0.542-12.36,1.312l-5.314-6.414v-0.42h5.082v4.786h5.082v-4.786h7.148L689.289,499.58z
M702.329,517.554l-8.713,0.14c-0.026-0.114-0.079-0.224-0.155-0.328l9.073-2.076L702.329,517.554z M666.025,494.058v0.401
c-0.325,0.085-0.657,0.163-0.979,0.251l-0.713-0.651H666.025z M666.025,495.263v0.341l-0.291-0.266
C665.83,495.311,665.929,495.289,666.025,495.263z M666.025,496.603v2.241h2.454l3.554,3.247c-2.98,0.871-5.693,1.943-8.062,3.179
l-10.904-5.064c0.33-0.173,0.666-0.344,1.007-0.513c3.276-1.624,6.914-3.003,10.823-4.12L666.025,496.603z M672.377,502.405
l15.07,13.768l-22.95-10.659C666.813,504.306,669.465,503.258,672.377,502.405z M669.572,498.844h2.043v-3.739l4.87,5.877
c-1.242,0.259-2.449,0.55-3.618,0.872L669.572,498.844z M691.664,494.058v4.786h12.332l-1.224,1.721
c-3.776-0.648-7.828-1.001-12.044-1.001c-0.32,0-0.64,0.002-0.959,0.006l-0.361-5.512H691.664z M703.939,499.641l-0.101,1.127
c-0.206-0.039-0.404-0.087-0.612-0.124L703.939,499.641z M702.896,511.25l-8.307,4.394l8.619-7.874L702.896,511.25z
M702.863,511.616l-0.306,3.407l-9.299,2.127c-0.053-0.046-0.11-0.09-0.172-0.133l0.598-0.547L702.863,511.616z M693.364,518.468
l8.74,1.595l-0.252,2.801l-8.846-4.108C693.147,518.667,693.268,518.571,693.364,518.468z M693.53,518.245
c0.056-0.1,0.091-0.205,0.102-0.312l8.676-0.14l-0.182,2.021L693.53,518.245z M656.116,486.551l-11.415-10.428h11.415V486.551z
M656.116,487.55v4.746h-1.779v1.763h8.903l0.969,0.885c-4.029,1.15-7.778,2.571-11.154,4.243
c-0.352,0.175-0.698,0.352-1.039,0.53l-3.311-1.538c0.63-0.372,1.01-0.852,1.01-1.376c0-1.184-1.936-2.143-4.324-2.143
c-1.018,0-1.941,0.182-2.68,0.473v-5.035h2.055v-0.76h-2.055v-9.479h2.055v-0.76h-2.055v-2.977h0.897L656.116,487.55z
M642.711,500.338h2.055v-0.76h-2.055v-1.104c0.739,0.292,1.662,0.473,2.68,0.473c1.158,0,2.209-0.226,2.985-0.593l3.31,1.537
c-3.677,1.959-6.684,4.15-8.975,6.503V500.338z M642.711,508.163c2.337-2.844,5.703-5.479,10.027-7.784l10.906,5.065
c-3.215,1.722-5.771,3.749-7.47,5.983l-13.463-2.457V508.163z M664.17,505.688l24.004,11.148l0.198,0.181
c-0.107,0.074-0.2,0.152-0.279,0.235l-31.242-5.701C658.515,509.362,661.02,507.375,664.17,505.688z M673.21,502.168
c1.146-0.316,2.33-0.602,3.548-0.855l12.647,15.264c-0.111,0.028-0.218,0.06-0.32,0.094L673.21,502.168z M677.203,501.222
c3.766-0.755,7.844-1.204,12.11-1.286l1.082,16.492c-0.187,0.011-0.369,0.03-0.544,0.057L677.203,501.222z M689.793,499.928
c0.311-0.004,0.623-0.006,0.935-0.006c4.131,0,8.103,0.346,11.804,0.981l-11.067,15.563c-0.19-0.025-0.388-0.04-0.591-0.045
L689.793,499.928z M702.985,500.982c0.278,0.05,0.543,0.112,0.818,0.165l-0.497,5.535l-10.935,9.99
c-0.142-0.048-0.294-0.09-0.453-0.126L702.985,500.982z M692.678,518.929l9.146,4.247l-0.629,7.002l-9.143-11.035
C692.279,519.085,692.49,519.013,692.678,518.929z M656.683,511.774l31.244,5.702c-0.056,0.1-0.091,0.204-0.102,0.312
l-33.276,0.536c-0.008-0.154-0.012-0.309-0.012-0.464C654.537,515.724,655.295,513.675,656.683,511.774z M687.841,518.026
c0.026,0.114,0.079,0.224,0.155,0.328l-30.225,6.916c-1.892-2.059-3.018-4.325-3.204-6.708L687.841,518.026z M688.199,518.57
c0.106,0.092,0.231,0.178,0.373,0.257l-22.753,12.035c-3.243-1.527-5.915-3.348-7.845-5.376L688.199,518.57z M688.923,518.989
c0.188,0.074,0.394,0.137,0.616,0.186l-11.067,15.563c-4.604-0.824-8.776-2.098-12.301-3.714L688.923,518.989z M689.992,519.254
c0.19,0.024,0.388,0.04,0.591,0.045l1.082,16.493c-0.311,0.004-0.623,0.006-0.936,0.006c-4.131,0-8.102-0.346-11.804-0.98
L689.992,519.254z M691.063,519.291c0.187-0.011,0.369-0.03,0.544-0.057l9.537,11.51l-0.39,4.342
c-2.753,0.394-5.632,0.641-8.61,0.697L691.063,519.291z M640.035,523.229h7.987v-2.08h-1.791v-0.763h-3.52v-1.635l11.136-0.179
c0.189,2.432,1.339,4.745,3.27,6.846l-13.578,3.106C641.982,526.835,640.816,525.059,640.035,523.229z M654.074,536.027
c-4.336-2.149-7.809-4.612-10.333-7.285l13.579-3.107c1.969,2.07,4.697,3.929,8.007,5.487l-10.218,5.404
C654.761,536.362,654.415,536.196,654.074,536.027z M655.459,536.689l10.219-5.405c3.597,1.65,7.855,2.95,12.553,3.791
l-4.97,6.989C666.716,540.907,660.672,539.092,655.459,536.689z M690.728,543.552c-5.882,0-11.614-0.483-17.013-1.409l4.97-6.989
c3.776,0.648,7.828,1.001,12.043,1.001c0.321,0,0.64-0.002,0.959-0.006l0.485,7.393
C691.692,543.549,691.211,543.552,690.728,543.552z M692.653,543.535l-0.485-7.395c2.956-0.057,5.813-0.299,8.553-0.681
l-0.69,7.679C697.611,543.354,695.148,543.49,692.653,543.535z"/>
</g>
</g>
<g id="Object">
<g style="opacity:0.3;">
<linearGradient id="SVGID_1_" gradientUnits="userSpaceOnUse" x1="207.5072" y1="393.376" x2="207.5072" y2="229.7061">
<stop offset="0" style="stop-color:#403E40"/>
<stop offset="0.1275" style="stop-color:#4E4D4E"/>
<stop offset="0.3124" style="stop-color:#5A5A5A"/>
<stop offset="0.5479" style="stop-color:#626262"/>
<stop offset="1" style="stop-color:#646464"/>
</linearGradient>
<polygon style="fill:url(#SVGID_1_);" points="175.04,393.376 239.974,393.376 239.974,259.43 241.819,259.43 241.819,255.601
237.792,255.601 237.792,250.701 205.844,250.701 205.844,243.255 193.638,243.255 191.815,238.393 188.799,238.393
188.799,229.706 188.126,229.706 187.454,229.706 187.454,238.393 181.859,238.393 181.859,243.255 181.859,248.859
181.859,250.701 177.223,250.701 177.223,255.601 173.195,255.601 173.195,259.43 175.04,259.43 "/>
<linearGradient id="SVGID_00000000931258187104496080000017865145222397382034_" gradientUnits="userSpaceOnUse" x1="188.1266" y1="229.7061" x2="188.1266" y2="227.2891">
<stop offset="0" style="stop-color:#403E40"/>
<stop offset="0.1275" style="stop-color:#4E4D4E"/>
<stop offset="0.3124" style="stop-color:#5A5A5A"/>
<stop offset="0.5479" style="stop-color:#626262"/>
<stop offset="1" style="stop-color:#646464"/>
</linearGradient>
<path style="fill:url(#SVGID_00000000931258187104496080000017865145222397382034_);" d="M189.335,228.498
c0-0.668-0.541-1.209-1.209-1.209c-0.667,0-1.208,0.541-1.208,1.209c0,0.667,0.541,1.208,1.208,1.208
C188.794,229.706,189.335,229.165,189.335,228.498z"/>
<linearGradient id="SVGID_00000036247364810958532620000001993945857249512106_" gradientUnits="userSpaceOnUse" x1="508.9194" y1="421.4165" x2="508.9194" y2="155.3276">
<stop offset="0" style="stop-color:#403E40"/>
<stop offset="0.1275" style="stop-color:#4E4D4E"/>
<stop offset="0.3124" style="stop-color:#5A5A5A"/>
<stop offset="0.5479" style="stop-color:#626262"/>
<stop offset="1" style="stop-color:#646464"/>
</linearGradient>
<polygon style="fill:url(#SVGID_00000036247364810958532620000001993945857249512106_);" points="777.689,401.218 777.689,221.14
697.741,204.545 706.501,401.218 471.904,401.218 471.904,289.958 467.586,289.958 467.586,223.38 456.438,223.38
456.438,205.547 452.456,205.547 452.456,179.391 430.514,179.391 430.514,160.168 428.562,160.168 428.562,179.391
424.264,179.391 424.264,155.328 422.311,155.328 422.311,179.391 408.44,179.391 408.44,223.38 401.275,223.38 401.275,289.958
395.133,289.958 395.133,401.218 332.748,401.218 332.748,253.114 333.825,253.114 333.825,251.402 326.147,251.402
326.147,241.111 327.14,241.111 327.14,239.463 320.092,239.463 320.092,233.547 320.725,233.547 320.725,232.282 317.69,232.282
317.69,226.693 314.794,226.693 314.343,226.693 314.343,206.742 314.029,206.742 313.63,206.742 313.228,206.742
313.228,226.693 311.874,226.693 311.874,215.571 311.56,215.571 311.161,215.571 310.759,215.571 310.759,226.693
307.411,226.693 307.411,232.282 303.909,232.282 303.909,233.547 305.009,233.547 305.009,239.463 297.962,239.463
297.962,241.111 298.954,241.111 298.954,251.402 291.276,251.402 291.276,253.114 292.354,253.114 292.354,401.218
84.29,401.218 84.29,421.417 933.548,421.417 933.548,401.218 "/>
</g>
<linearGradient id="SVGID_00000121963338060960119620000016097684000583641491_" gradientUnits="userSpaceOnUse" x1="499.6613" y1="451.2495" x2="499.6613" y2="202.0752">
<stop offset="0.0815" style="stop-color:#403E40"/>
<stop offset="0.4715" style="stop-color:#444244"/>
<stop offset="0.8768" style="stop-color:#504F50"/>
<stop offset="1" style="stop-color:#555455"/>
</linearGradient>
<path style="fill:url(#SVGID_00000121963338060960119620000016097684000583641491_);" d="M918.278,419.4v-18.183h-25.56
l-9.674-71.571h-1.452v-8.598h-5.082v8.598h-1.259v-3.216h-3.557v3.216h-7.369v-7.496h-2.033v-5.209h-14.483v5.209h-2.033v7.496
h-2.033v42.986h-12.874V263.564h-4.235v-23.92h-1.355v23.92h-3.219v-33.181h-1.355v33.181h-2.795v-17.961h-1.355v17.961h-24.985
v77.418h-27.78v50.154h-25.944l-20.601-37.972c3.473-2,6.738-4.405,9.735-7.193l4.225,4.508c-0.907,0.793-1.481,1.957-1.481,3.256
c0,2.388,1.936,4.324,4.324,4.324c2.388,0,4.324-1.936,4.324-4.324c0-2.388-1.936-4.324-4.324-4.324
c-0.916,0-1.764,0.285-2.463,0.771l-4.255-4.54c0.36-0.341,0.717-0.687,1.069-1.039c4.458-4.459,8.028-9.568,10.623-15.114
l4.705,2.172c-0.086,0.339-0.131,0.693-0.131,1.059c0,2.388,1.936,4.324,4.324,4.324c2.388,0,4.324-1.936,4.324-4.324
c0-2.388-1.936-4.324-4.324-4.324c-1.852,0-3.432,1.165-4.048,2.803l-4.648-2.146c2.866-6.273,4.489-13.092,4.744-20.153
l5.065,0.165c0.062,2.334,1.972,4.207,4.321,4.207c2.388,0,4.324-1.936,4.324-4.324c0-2.388-1.936-4.324-4.324-4.324
c-2.266,0-4.123,1.743-4.308,3.961l-5.064-0.165c0.013-0.496,0.021-0.993,0.021-1.491c0-6.303-1.088-12.438-3.173-18.192
l4.231-1.558c0.664,1.533,2.191,2.606,3.968,2.606c2.388,0,4.324-1.936,4.324-4.324c0-2.388-1.936-4.324-4.324-4.324
c-2.388,0-4.324,1.936-4.324,4.324c0,0.44,0.066,0.866,0.189,1.267l-4.229,1.557c-2.41-6.464-6.085-12.437-10.898-17.611
l3.31-3.102c0.776,0.741,1.827,1.197,2.985,1.197c2.388,0,4.324-1.935,4.324-4.324c0-2.388-1.936-4.324-4.324-4.324
c-2.388,0-4.324,1.936-4.324,4.324c0,1.057,0.38,2.025,1.01,2.776l-3.31,3.102c-0.341-0.36-0.687-0.717-1.039-1.069
c-5.012-5.013-10.848-8.903-17.201-11.546l1.821-4.434c0.413,0.131,0.853,0.203,1.309,0.203c2.388,0,4.324-1.936,4.324-4.324
c0-2.388-1.936-4.324-4.324-4.324c-2.388,0-4.324,1.936-4.324,4.324c0,1.762,1.054,3.276,2.566,3.95l-1.816,4.423
c-5.685-2.304-11.775-3.615-18.057-3.842l0.197-6.06c0.046,0.001,0.091,0.003,0.137,0.003c2.388,0,4.324-1.936,4.324-4.324
c0-2.388-1.936-4.324-4.324-4.324c-2.388,0-4.324,1.936-4.324,4.324c0,2.179,1.611,3.98,3.707,4.279l-0.198,6.086
c-0.496-0.014-0.993-0.021-1.491-0.021c-6.048,0-11.942,1.001-17.493,2.925l-1.743-4.946c1.573-0.647,2.68-2.193,2.68-4
c0-2.388-1.936-4.324-4.324-4.324c-2.388,0-4.324,1.936-4.324,4.324c0,2.388,1.936,4.324,4.324,4.324
c0.413,0,0.812-0.059,1.19-0.167l1.744,4.948c-6.732,2.401-12.948,6.166-18.308,11.152l-3.16-3.372
c0.707-0.77,1.139-1.796,1.139-2.923c0-2.388-1.936-4.324-4.324-4.324c-2.388,0-4.324,1.935-4.324,4.324
c0,2.388,1.936,4.324,4.324,4.324c1.088,0,2.081-0.402,2.841-1.065l3.154,3.366c-0.36,0.341-0.717,0.688-1.069,1.04
c-4.458,4.458-8.028,9.568-10.623,15.114l-4.015-1.854c0.146-0.433,0.225-0.896,0.225-1.377c0-2.388-1.936-4.324-4.324-4.324
c-2.388,0-4.324,1.936-4.324,4.324c0,2.388,1.936,4.324,4.324,4.324c1.736,0,3.232-1.023,3.92-2.5l3.992,1.843
c-2.865,6.273-4.489,13.093-4.744,20.153l-5.154-0.167c-0.064-2.333-1.973-4.204-4.321-4.204c-2.388,0-4.324,1.936-4.324,4.324
c0,2.388,1.936,4.324,4.324,4.324c2.267,0,4.125-1.744,4.308-3.963l5.153,0.167c-0.013,0.496-0.021,0.993-0.021,1.491
c0,6.302,1.088,12.438,3.173,18.191l-4.231,1.558c-0.664-1.533-2.191-2.606-3.969-2.606c-2.388,0-4.324,1.936-4.324,4.324
c0,2.388,1.936,4.324,4.324,4.324c2.388,0,4.324-1.936,4.324-4.324c0-0.44-0.066-0.866-0.189-1.266l4.229-1.557
c2.409,6.464,6.084,12.436,10.898,17.611l-3.31,3.102c-0.776-0.741-1.827-1.197-2.985-1.197c-2.388,0-4.324,1.935-4.324,4.324
c0,2.388,1.936,4.324,4.324,4.324c2.388,0,4.324-1.936,4.324-4.324c0-1.057-0.38-2.025-1.01-2.776l3.311-3.102
c0.341,0.36,0.687,0.717,1.039,1.069c3.376,3.375,7.125,6.242,11.154,8.561l-20.601,37.972h-16.685v-25.743h-6.801v-48.882h2.645
v-1.063H564.32v1.063h2.645v24.652h-5.729l-5.467,28.096v-92.143h-2.399v-33.37h-12.075v-12.705h-6.46V220.37h-2.267v-18.294
h-0.586v18.294h-2.267v10.672h-6.46v12.705h-11.982v33.37h-2.491V337.7h-18.579v53.436h-61.639V287.814h4.954v-1.626h-4.954v-2.911
h-6.279v-4.781h-51.354v4.781h-6.279v2.911h-4.954v1.626h4.954v64.319h-8.4v-12.069h-51.819v14.737h-11.298v27.442h-18.294V292.55
h-6.098v-6.099h-4.828v-16.134h-1.017v16.134h-2.287v-20.2h-1.016v20.2h-2.811v6.099h-1.509v89.693h-37.859v-52.597h1.779v-3.557
h-11.688v-9.655h-5.59v9.655h-5.082v-9.655h-5.082v9.655h-9.885v-9.655h-30.261v9.655h-7.066v3.557h4.017v19.819h-11.18v44.72
h-15.346v-11.942h-17.084v26.044H84.29V419.4H53.82v31.849h891.682V419.4H918.278z M716.559,351.896l-7.135-13.152
c2.287-1.349,4.417-2.938,6.354-4.731l10.219,10.906C723.091,347.622,719.925,349.955,716.559,351.896z M689.85,309.7
c0.175,0.055,0.357,0.094,0.544,0.116l-1.082,33.274c-4.266-0.165-8.344-1.071-12.11-2.594L689.85,309.7z M676.758,340.314
c-1.218-0.512-2.402-1.089-3.548-1.726l15.875-29.261c0.102,0.07,0.209,0.133,0.32,0.19L676.758,340.314z M690.874,309.832
c0.203-0.01,0.401-0.041,0.591-0.091l11.067,31.4c-3.701,1.281-7.673,1.978-11.804,1.978c-0.313,0-0.625-0.004-0.936-0.012
L690.874,309.832z M691.917,309.581c0.159-0.072,0.311-0.157,0.453-0.254l15.875,29.261c-1.677,0.932-3.435,1.734-5.261,2.394
L691.917,309.581z M694.589,311.399l20.697,22.088c-1.893,1.751-3.973,3.303-6.206,4.623L694.589,311.399z M693.683,309.731
l-0.598-1.103c0.062-0.086,0.119-0.176,0.172-0.268l30.224,13.952c-1.93,4.091-4.602,7.766-7.844,10.847L693.683,309.731z
M693.46,307.925c0.077-0.21,0.129-0.432,0.156-0.662l33.274,1.082c-0.186,4.809-1.313,9.38-3.204,13.534L693.46,307.925z
M693.631,306.783c-0.011-0.217-0.046-0.428-0.102-0.63l31.244-11.503c1.388,3.836,2.146,7.971,2.146,12.279
c0,0.313-0.004,0.624-0.012,0.936L693.631,306.783z M693.364,305.702c-0.097-0.207-0.217-0.401-0.358-0.579l24.281-22.752
c3.15,3.404,5.654,7.411,7.319,11.828L693.364,305.702z M692.678,304.772c-0.188-0.17-0.399-0.315-0.627-0.433l12.647-30.797
c4.661,1.958,8.83,4.864,12.261,8.477L692.678,304.772z M691.606,304.157c-0.175-0.055-0.357-0.094-0.544-0.116l1.082-33.273
c4.265,0.164,8.344,1.071,12.11,2.594L691.606,304.157z M690.582,304.025c-0.203,0.01-0.401,0.041-0.591,0.09l-11.067-31.4
c3.701-1.281,7.672-1.978,11.804-1.978c0.313,0,0.625,0.004,0.936,0.012L690.582,304.025z M689.539,304.276
c-0.221,0.099-0.428,0.226-0.616,0.375L666.17,280.37c3.525-3.261,7.697-5.832,12.301-7.494L689.539,304.276z M688.572,304.978
c-0.143,0.158-0.268,0.332-0.373,0.518l-30.224-13.953c1.929-4.091,4.602-7.766,7.845-10.847L688.572,304.978z M687.996,305.932
c-0.077,0.211-0.129,0.432-0.156,0.662l-33.274-1.082c0.186-4.809,1.313-9.38,3.204-13.534L687.996,305.932z M687.825,307.075
c0.011,0.217,0.046,0.427,0.102,0.629l-31.244,11.503c-1.388-3.836-2.146-7.97-2.146-12.279c0-0.313,0.004-0.625,0.012-0.936
L687.825,307.075z M688.092,308.155c0.078,0.168,0.171,0.326,0.279,0.474l-0.198,0.365l-24.004,22.492
c-3.15-3.404-5.654-7.411-7.319-11.828L688.092,308.155z M687.446,310.332l-15.07,27.777c-2.912-1.72-5.564-3.835-7.88-6.272
L687.446,310.332z M672.866,339.221c1.169,0.649,2.376,1.238,3.618,1.759l-5.681,13.833c-1.732-0.721-3.424-1.537-5.07-2.445
L672.866,339.221z M676.929,341.163c3.843,1.555,8.006,2.479,12.36,2.647l-0.485,14.92c-6.107-0.221-12.028-1.496-17.555-3.735
L676.929,341.163z M689.769,343.828c0.319,0.008,0.638,0.013,0.959,0.013c4.215,0,8.267-0.712,12.043-2.02l4.97,14.101
c-5.399,1.87-11.131,2.844-17.013,2.844c-0.482,0-0.964-0.007-1.444-0.021L689.769,343.828z M703.225,341.661
c1.862-0.672,3.655-1.49,5.365-2.44l7.133,13.148c-2.417,1.333-4.933,2.468-7.528,3.394L703.225,341.661z M727.382,343.582
c-0.341,0.341-0.686,0.676-1.035,1.007l-10.217-10.904c3.31-3.144,6.038-6.895,8.007-11.07l13.579,6.269
C735.191,334.277,731.718,339.247,727.382,343.582z M737.917,328.448l-13.578-6.268c1.931-4.239,3.081-8.904,3.27-13.813
l14.921,0.485C742.281,315.718,740.702,322.349,737.917,328.448z M742.565,306.929c0,0.482-0.007,0.963-0.02,1.444l-14.917-0.485
c0.008-0.319,0.012-0.639,0.012-0.959c0-4.397-0.774-8.615-2.19-12.528l14.03-5.165
C741.507,294.832,742.565,300.799,742.565,306.929z M728.718,271.66c4.68,5.032,8.253,10.839,10.596,17.124l-14.032,5.167
c-1.699-4.508-4.255-8.598-7.47-12.072L728.718,271.66z M727.382,270.274c0.341,0.341,0.676,0.686,1.007,1.035l-10.904,10.217
c-3.502-3.687-7.756-6.653-12.513-8.65l5.681-13.833C716.831,261.615,722.507,265.399,727.382,270.274z M692.652,255.126
c6.107,0.221,12.028,1.496,17.555,3.735l-5.681,13.833c-3.843-1.555-8.006-2.479-12.36-2.647L692.652,255.126z M690.728,255.092
c0.482,0,0.964,0.007,1.444,0.02l-0.485,14.917c-0.319-0.008-0.638-0.012-0.959-0.012c-4.215,0-8.267,0.712-12.043,2.019
l-4.97-14.101C679.114,256.065,684.846,255.092,690.728,255.092z M673.261,258.094l4.97,14.102
c-4.698,1.696-8.956,4.319-12.553,7.648l-10.219-10.906C660.671,264.091,666.716,260.43,673.261,258.094z M654.074,270.274
c0.341-0.341,0.687-0.676,1.035-1.007l10.218,10.904c-3.31,3.144-6.038,6.895-8.007,11.071l-13.579-6.269
C646.265,279.58,649.738,274.61,654.074,270.274z M643.539,285.409l13.578,6.268c-1.931,4.239-3.081,8.905-3.27,13.813
l-14.921-0.485C639.175,298.14,640.754,291.509,643.539,285.409z M638.891,306.929c0-0.482,0.007-0.964,0.02-1.444l14.917,0.485
c-0.008,0.318-0.012,0.638-0.012,0.959c0,4.396,0.774,8.614,2.19,12.528l-14.03,5.166
C639.949,319.025,638.891,313.058,638.891,306.929z M652.738,342.197c-4.68-5.032-8.253-10.839-10.597-17.124l14.032-5.167
c1.699,4.508,4.255,8.598,7.47,12.072L652.738,342.197z M654.074,343.582c-0.341-0.341-0.676-0.686-1.007-1.035l10.904-10.217
c2.368,2.494,5.082,4.656,8.062,6.414l-7.135,13.152C660.988,349.642,657.35,346.859,654.074,343.582z M665.046,353.636
c1.692,0.934,3.431,1.771,5.21,2.512l-1.821,4.434c-0.413-0.131-0.853-0.202-1.309-0.202c-2.388,0-4.324,1.936-4.324,4.324
c0,2.388,1.936,4.324,4.324,4.324c2.388,0,4.324-1.936,4.324-4.324c0-1.762-1.054-3.276-2.566-3.95l1.816-4.423
c5.685,2.304,11.775,3.615,18.057,3.842l-0.148,4.534c-2.341,0.053-4.224,1.967-4.224,4.321c0,2.388,1.936,4.324,4.324,4.324
c2.388,0,4.324-1.936,4.324-4.324c0-2.26-1.734-4.113-3.944-4.306l0.148-4.535c0.496,0.014,0.993,0.021,1.491,0.021
c6.048,0,11.942-1.001,17.493-2.925l1.554,4.408c-1.471,0.69-2.491,2.184-2.491,3.916c0,2.388,1.936,4.324,4.324,4.324
c2.388,0,4.324-1.936,4.324-4.324c0-2.388-1.936-4.324-4.324-4.324c-0.485,0-0.951,0.081-1.387,0.229l-1.547-4.388
c2.667-0.952,5.252-2.117,7.736-3.487l20.345,37.5h-92.055L665.046,353.636z"/>
<g>
<linearGradient id="SVGID_00000121273610027325662480000007068999652675512506_" gradientUnits="userSpaceOnUse" x1="815.83" y1="285.1626" x2="815.83" y2="287.5796">
<stop offset="0" style="stop-color:#403E40"/>
<stop offset="1" style="stop-color:#161F21"/>
</linearGradient>
<path style="fill:url(#SVGID_00000121273610027325662480000007068999652675512506_);" d="M816.844,286.371
c0-0.667-0.454-1.208-1.014-1.208c-0.56,0-1.014,0.541-1.014,1.208c0,0.668,0.454,1.208,1.014,1.208
C816.39,287.58,816.844,287.039,816.844,286.371z"/>
<linearGradient id="SVGID_00000011738580747612097720000010840228285618223286_" gradientUnits="userSpaceOnUse" x1="500" y1="287.5796" x2="500" y2="451.2495">
<stop offset="0" style="stop-color:#403E40"/>
<stop offset="1" style="stop-color:#161F21"/>
</linearGradient>
<polygon style="fill:url(#SVGID_00000011738580747612097720000010840228285618223286_);" points="927.404,433.241 921.11,391.136
908.236,391.136 908.236,356.197 909.828,356.197 909.828,353.036 903.578,353.036 903.578,339.427 902.815,339.427
902.815,353.036 898.496,353.036 898.496,341.887 897.734,341.887 897.734,353.036 871.695,353.036 871.695,356.197
873.474,356.197 873.474,427.088 843.745,427.088 843.745,395.334 826.815,395.334 826.815,317.303 828.363,317.303
828.363,313.475 824.982,313.475 824.982,308.574 821.091,308.574 821.091,306.732 821.091,301.129 821.091,296.267
816.395,296.267 816.395,287.58 815.83,287.58 815.265,287.58 815.265,296.267 812.734,296.267 811.204,301.129 800.958,301.129
800.958,308.574 774.141,308.574 774.141,313.475 770.76,313.475 770.76,317.303 772.309,317.303 772.309,368.073
749.871,368.073 749.871,417.957 724.973,406.925 724.973,358.507 728.991,358.507 728.991,354.95 721.924,354.95
721.924,345.294 691.664,345.294 691.664,354.95 681.778,354.95 681.778,345.294 676.696,345.294 676.696,354.95 671.615,354.95
671.615,345.294 666.025,345.294 666.025,354.95 654.337,354.95 654.337,358.507 656.115,358.507 656.115,425.617
644.765,425.617 644.765,424.913 642.711,424.913 642.711,405.789 644.765,405.789 644.765,404.255 642.711,404.255
642.711,385.13 644.765,385.13 644.765,383.597 642.711,383.597 642.711,364.472 644.765,364.472 644.765,362.939
642.711,362.939 642.711,343.814 644.765,343.814 644.765,342.28 642.711,342.28 642.711,323.156 644.765,323.156
644.765,321.622 642.711,321.622 642.711,301.83 646.231,301.83 646.231,300.291 648.021,300.291 648.021,296.095
614.595,296.095 614.595,291.429 615.682,291.429 615.682,290.081 597.484,290.081 597.484,291.429 598.571,291.429
598.571,296.095 590.287,296.095 590.287,300.291 592.078,300.291 592.078,301.83 595.598,301.83 595.598,321.622
593.543,321.622 593.543,323.156 595.598,323.156 595.598,342.28 593.543,342.28 593.543,343.814 595.598,343.814
595.598,362.939 593.543,362.939 593.543,364.472 595.598,364.472 595.598,383.597 593.543,383.597 593.543,385.13
595.598,385.13 595.598,404.255 593.543,404.255 593.543,405.789 595.598,405.789 595.598,424.913 593.543,424.913
593.543,426.447 595.598,426.447 595.598,442.075 584.189,442.075 584.189,381.685 538.283,381.685 538.283,425.617
530.83,425.617 530.83,288.763 525.315,288.763 525.315,292.286 515.14,294.775 515.14,292.487 509.625,292.487 509.625,296.124
499.45,298.612 499.45,295.989 493.936,295.989 493.936,299.961 483.76,302.45 483.76,300.286 478.246,300.286 478.246,303.799
468.071,306.288 468.071,304.423 462.557,304.423 462.557,438.816 454.799,438.816 454.799,367.168 426.065,367.168
426.065,411.227 396.608,411.227 396.608,373.655 392.979,373.655 392.979,361.316 395.133,361.316 395.133,360.077
385.601,360.077 385.601,356.197 384.584,356.197 384.584,360.077 381.535,360.077 381.535,340.162 380.773,340.162
380.773,360.077 376.802,360.077 376.802,361.316 379.138,361.316 379.138,373.655 359.697,373.655 359.697,402.316
340.771,402.316 330.886,438.816 311.053,438.816 311.053,319.642 284.794,319.642 284.794,315.373 286.286,315.373
286.286,313.875 266.397,313.875 266.397,315.373 267.961,315.373 267.961,319.642 267.961,326.508 267.961,408.127
255.579,408.127 255.579,377.289 259.98,377.289 259.98,374.497 228.825,374.497 228.825,355.867 222.882,355.867
222.882,352.039 209.616,352.039 209.616,355.867 198.821,355.867 198.821,422.389 163.408,422.389 163.408,373.052
133.589,373.052 133.589,412.021 107.712,412.021 107.712,436.954 89.054,436.954 89.054,405.609 64.517,405.609 64.517,432.333
42.301,432.333 42.301,451.25 64.517,451.25 72.025,451.25 84.29,451.25 89.054,451.25 107.712,451.25 133.589,451.25
142.285,451.25 146.941,451.25 163.408,451.25 198.821,451.25 209.616,451.25 222.882,451.25 228.825,451.25 252.067,451.25
255.579,451.25 267.961,451.25 288.725,451.25 298.954,451.25 311.053,451.25 359.697,451.25 370.931,451.25 389.895,451.25
396.608,451.25 426.568,451.25 434.618,451.25 462.557,451.25 465.314,451.25 468.071,451.25 478.246,451.25 483.76,451.25
493.936,451.25 498.545,451.25 499.45,451.25 509.625,451.25 515.14,451.25 523.887,451.25 525.315,451.25 528.072,451.25
530.83,451.25 538.283,451.25 576.227,451.25 581.647,451.25 584.189,451.25 595.598,451.25 615.682,451.25 634.069,451.25
642.711,451.25 656.115,451.25 689.405,451.25 724.606,451.25 724.973,451.25 749.871,451.25 763.792,451.25 772.309,451.25
777.689,451.25 819.353,451.25 826.815,451.25 835.736,451.25 843.745,451.25 873.474,451.25 895.58,451.25 900.105,451.25
908.236,451.25 957.698,451.25 957.698,433.241 "/>
</g>
</g>
</svg>

frontend/assets/plan.svg Normal file

@@ -0,0 +1,161 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px"
viewBox="0 0 2200 2200" style="enable-background:new 0 0 2200 2200;" xml:space="preserve">
<g id="Objects">
<g>
<path style="fill:#788D8E;" d="M1202.178,2002.073c-5.328,0-9.648-4.319-9.649-9.647c-0.001-5.328,4.319-9.649,9.647-9.649
c9.63-0.001,19.271-0.006,28.918-0.014c0.003,0,0.006,0,0.009,0c5.325,0,9.643,4.314,9.647,9.639
c0.005,5.328-4.311,9.651-9.639,9.656C1221.458,2002.068,1211.813,2002.072,1202.178,2002.073z M1144.298,2002.03
c-0.006,0-0.01,0-0.016,0c-9.658-0.015-19.305-0.036-28.94-0.061c-5.328-0.014-9.636-4.345-9.622-9.673
c0.014-5.319,4.331-9.622,9.648-9.622c0.008,0,0.017,0,0.025,0c9.628,0.025,19.269,0.046,28.919,0.061
c5.328,0.009,9.641,4.335,9.632,9.663C1153.937,1997.721,1149.619,2002.03,1144.298,2002.03z M1288.979,2001.966
c-5.317,0-9.634-4.306-9.647-9.626c-0.012-5.328,4.298-9.657,9.626-9.669c9.634-0.022,19.274-0.047,28.923-0.075
c5.297,0.018,9.66,4.292,9.676,9.619c0.015,5.328-4.291,9.66-9.619,9.676c-9.652,0.028-19.299,0.054-28.936,0.075
C1288.994,2001.966,1288.986,2001.966,1288.979,2001.966z M1057.498,2001.759c-0.015,0-0.03,0-0.045,0
c-9.659-0.044-19.306-0.095-28.939-0.152c-5.328-0.031-9.622-4.376-9.591-9.704c0.031-5.309,4.344-9.591,9.646-9.591
c0.02,0,0.04,0,0.058,0c9.625,0.057,19.263,0.107,28.914,0.152c5.328,0.025,9.628,4.364,9.603,9.692
C1067.12,1997.468,1062.805,2001.759,1057.498,2001.759z M1375.787,2001.691c-5.31,0-9.625-4.293-9.647-9.609
c-0.022-5.328,4.281-9.665,9.609-9.686c9.636-0.039,19.28-0.079,28.928-0.123c0.015,0,0.029,0,0.044,0
c5.308,0,9.623,4.291,9.647,9.604c0.024,5.328-4.276,9.666-9.604,9.691c-9.651,0.043-19.297,0.084-28.937,0.122
C1375.813,2001.691,1375.8,2001.691,1375.787,2001.691z M1462.601,2001.29c-5.305,0-9.619-4.287-9.647-9.598
c-0.027-5.328,4.27-9.67,9.598-9.698c9.641-0.05,19.285-0.102,28.932-0.156c0.018,0,0.037,0,0.056,0
c5.302,0,9.616,4.284,9.646,9.594c0.029,5.328-4.266,9.671-9.594,9.701c-9.649,0.054-19.297,0.105-28.94,0.156
C1462.635,2001.29,1462.618,2001.29,1462.601,2001.29z M970.707,2001.198c-0.027,0-0.054,0-0.081,0
c-9.662-0.079-19.308-0.166-28.939-0.259c-5.328-0.052-9.605-4.413-9.554-9.741c0.052-5.296,4.361-9.554,9.646-9.554
c0.032,0,0.063,0,0.095,0.001c9.621,0.093,19.258,0.179,28.911,0.258c5.328,0.044,9.612,4.399,9.568,9.727
C980.31,1996.93,975.998,2001.198,970.707,2001.198z M1549.42,2000.802c-5.301,0-9.614-4.281-9.646-9.59
c-0.032-5.328,4.262-9.673,9.59-9.705l28.936-0.176c0.021,0,0.042,0,0.061,0c5.301,0,9.614,4.279,9.647,9.587
c0.033,5.328-4.259,9.674-9.587,9.708l-28.942,0.176C1549.459,2000.802,1549.439,2000.802,1549.42,2000.802z M883.923,2000.297
c-0.041,0-0.083,0-0.124-0.001c-9.663-0.122-19.308-0.25-28.935-0.385c-5.328-0.075-9.586-4.454-9.511-9.782
c0.075-5.328,4.464-9.604,9.782-9.511c9.617,0.136,19.254,0.264,28.907,0.385c5.328,0.067,9.592,4.44,9.525,9.768
C893.501,1996.057,889.195,2000.297,883.923,2000.297z M1636.244,2000.264c-5.3,0-9.612-4.279-9.646-9.586
c-0.034-5.328,4.258-9.675,9.586-9.709l28.943-0.184c0.021,0,0.042,0,0.062,0c5.3,0,9.613,4.279,9.647,9.586
c0.034,5.328-4.259,9.675-9.586,9.709l-28.943,0.184C1636.286,2000.264,1636.265,2000.264,1636.244,2000.264z M1723.075,1999.719
c-5.3,0-9.614-4.28-9.647-9.588c-0.033-5.328,4.26-9.674,9.588-9.707l28.946-0.177c0.02,0,0.04,0,0.059,0
c5.301,0,9.615,4.28,9.647,9.589c0.032,5.328-4.261,9.674-9.589,9.706l-28.945,0.177
C1723.115,1999.719,1723.095,1999.719,1723.075,1999.719z M1809.912,1999.201c-5.303,0-9.616-4.283-9.647-9.593
c-0.03-5.328,4.265-9.672,9.593-9.702l28.952-0.16c0.018,0,0.036,0,0.053,0c5.304,0,9.618,4.285,9.647,9.596
c0.028,5.328-4.268,9.67-9.596,9.699l-28.947,0.16C1809.949,1999.201,1809.931,1999.201,1809.912,1999.201z M797.15,1998.997
c-0.057,0-0.115-0.001-0.172-0.002c-9.665-0.169-19.31-0.346-28.935-0.531c-5.327-0.103-9.562-4.504-9.459-9.832
c0.104-5.327,4.48-9.533,9.832-9.459c9.612,0.186,19.245,0.363,28.899,0.531c5.327,0.092,9.57,4.487,9.477,9.814
C806.701,1994.788,802.399,1998.997,797.15,1998.997z M1896.756,1998.75c-5.307,0-9.621-4.29-9.647-9.602
c-0.025-5.328,4.274-9.667,9.602-9.693c9.658-0.045,19.31-0.089,28.958-0.129c0.014,0,0.027,0,0.041,0
c5.309,0,9.624,4.292,9.647,9.607c0.023,5.328-4.279,9.666-9.607,9.688c-9.644,0.041-19.294,0.084-28.948,0.129
C1896.787,1998.75,1896.771,1998.75,1896.756,1998.75z M710.388,1997.237c-0.076,0-0.152-0.001-0.228-0.003
c-9.667-0.223-19.311-0.457-28.932-0.7c-5.326-0.135-9.535-4.562-9.401-9.889c0.133-5.243,4.425-9.404,9.64-9.404
c0.083,0,0.166,0.001,0.249,0.003c9.607,0.243,19.237,0.477,28.891,0.7c5.327,0.122,9.545,4.541,9.421,9.868
C719.907,1993.064,715.612,1997.237,710.388,1997.237z M623.642,1994.948c-0.096,0-0.193-0.001-0.29-0.004
c-9.67-0.287-19.314-0.585-28.929-0.894c-5.326-0.172-9.503-4.628-9.333-9.953c0.171-5.325,4.65-9.465,9.953-9.333
c9.6,0.309,19.227,0.607,28.88,0.892c5.326,0.158,9.516,4.604,9.357,9.93C633.126,1990.815,628.838,1994.948,623.642,1994.948z
M536.937,1991.683c-0.226,0-0.453-0.007-0.682-0.024c-10.955-0.765-20.624-1.8-29.559-3.167
c-5.267-0.806-8.883-5.728-8.078-10.995c0.805-5.268,5.725-8.888,10.995-8.078c8.408,1.286,17.563,2.264,27.985,2.992
c5.316,0.371,9.323,4.981,8.952,10.296C546.196,1987.794,541.959,1991.683,536.937,1991.683z M452.936,1972.044
c-1.444,0-2.91-0.325-4.291-1.012c-5.129-2.553-10.134-5.398-14.875-8.458c-3.821-2.466-7.597-5.17-11.224-8.036
c-4.179-3.305-4.889-9.372-1.585-13.552c3.305-4.18,9.371-4.889,13.552-1.585c3.144,2.486,6.415,4.828,9.719,6.961
c4.142,2.673,8.519,5.162,13.01,7.397c4.77,2.373,6.713,8.165,4.34,12.935C459.893,1970.083,456.481,1972.044,452.936,1972.044z
M390.843,1913.094c-3.257,0-6.436-1.65-8.252-4.636c-5.088-8.366-9.753-17.406-13.866-26.869
c-2.125-4.886,0.115-10.57,5.002-12.693c4.884-2.125,10.569,0.114,12.694,5.002c3.765,8.661,8.023,16.915,12.657,24.534
c2.768,4.552,1.323,10.487-3.23,13.256C394.28,1912.639,392.55,1913.094,390.843,1913.094z M360.798,1832.103
c-4.545,0-8.593-3.227-9.469-7.857c-1.832-9.681-3.208-19.642-4.092-29.605c-0.471-5.307,3.45-9.991,8.757-10.463
c5.305-0.466,9.991,3.449,10.462,8.757c0.828,9.335,2.117,18.663,3.831,27.724c0.99,5.236-2.451,10.282-7.686,11.272
C361.996,1832.047,361.393,1832.103,360.798,1832.103z M357.3,1745.636c-0.337,0-0.677-0.018-1.02-0.054
c-5.299-0.557-9.143-5.304-8.586-10.603c1.051-10.004,2.612-19.95,4.636-29.561c1.098-5.214,6.217-8.55,11.429-7.451
c5.214,1.099,8.55,6.215,7.451,11.429c-1.889,8.965-3.345,18.252-4.327,27.6C366.362,1741.952,362.175,1745.636,357.3,1745.636z
M379.592,1662.099c-1.292,0-2.604-0.261-3.863-0.812c-4.881-2.136-7.107-7.824-4.97-12.706
c3.967-9.067,8.429-18.064,13.26-26.742c2.591-4.655,8.466-6.329,13.122-3.737c4.656,2.592,6.329,8.466,3.737,13.122
c-4.533,8.143-8.72,16.585-12.442,25.091C386.85,1659.939,383.308,1662.099,379.592,1662.099z M425.149,1588.527
c-2.174,0-4.36-0.73-6.162-2.228c-4.097-3.407-4.658-9.489-1.252-13.587c6.291-7.567,13.033-14.977,20.039-22.024
c3.756-3.78,9.865-3.796,13.644-0.039c3.778,3.756,3.796,9.865,0.039,13.643c-6.604,6.642-12.958,13.624-18.884,20.754
C430.666,1587.342,427.917,1588.527,425.149,1588.527z M488.757,1529.81c-3.021,0-5.994-1.414-7.875-4.063
c-3.085-4.345-2.062-10.367,2.282-13.452c8.036-5.705,16.424-11.148,24.935-16.182c4.587-2.711,10.502-1.194,13.215,3.393
c2.712,4.586,1.194,10.503-3.393,13.215c-8.051,4.762-15.987,9.912-23.588,15.308
C492.639,1529.232,490.688,1529.81,488.757,1529.81z M565.026,1488.849c-3.858,0-7.5-2.33-8.989-6.14
c-1.939-4.962,0.513-10.558,5.476-12.497c8.996-3.515,18.386-6.814,27.91-9.805c5.085-1.593,10.499,1.23,12.094,6.314
c1.597,5.083-1.229,10.498-6.313,12.095c-9.107,2.86-18.081,6.012-26.67,9.368
C567.381,1488.636,566.194,1488.849,565.026,1488.849z M648.419,1465.236c-4.529,0-8.569-3.204-9.462-7.816
c-1.012-5.232,2.408-10.293,7.639-11.306c9.243-1.788,18.981-3.446,28.945-4.926c5.268-0.786,10.177,2.855,10.961,8.125
c0.783,5.27-2.855,10.178-8.125,10.961c-9.686,1.439-19.145,3.049-28.114,4.784
C649.644,1465.178,649.027,1465.236,648.419,1465.236z M734.448,1453.776c-4.949,0-9.161-3.786-9.599-8.809
c-0.464-5.308,3.463-9.987,8.771-10.45c8.927-0.779,18.419-1.521,29.019-2.265c5.285-0.369,9.926,3.633,10.3,8.948
c0.373,5.315-3.633,9.926-8.948,10.299c-10.489,0.737-19.875,1.469-28.691,2.239
C735.013,1453.764,734.729,1453.776,734.448,1453.776z M821.074,1447.866c-5.059,0-9.308-3.941-9.621-9.059
c-0.325-5.318,3.722-9.893,9.041-10.219l28.889-1.767c5.341-0.318,9.893,3.723,10.219,9.041c0.325,5.318-3.722,9.893-9.041,10.219
l-28.889,1.766C821.471,1447.86,821.272,1447.866,821.074,1447.866z M907.74,1442.568c-5.059,0-9.308-3.941-9.621-9.059
c-0.325-5.318,3.722-9.893,9.041-10.219l28.889-1.766c5.344-0.326,9.893,3.723,10.219,9.041c0.325,5.318-3.722,9.893-9.041,10.218
l-28.889,1.767C908.137,1442.562,907.938,1442.568,907.74,1442.568z M994.406,1437.269c-5.059,0-9.307-3.941-9.62-9.059
c-0.325-5.318,3.722-9.893,9.041-10.219l28.889-1.766c5.345-0.309,9.893,3.722,10.219,9.041c0.325,5.318-3.722,9.893-9.041,10.219
l-28.889,1.767C994.805,1437.263,994.604,1437.269,994.406,1437.269z M1081.072,1431.969c-5.059,0-9.307-3.941-9.62-9.059
c-0.325-5.318,3.722-9.893,9.041-10.219l28.889-1.766c5.322-0.318,9.894,3.723,10.219,9.041c0.325,5.318-3.722,9.893-9.041,10.219
l-28.889,1.767C1081.47,1431.964,1081.27,1431.969,1081.072,1431.969z M1167.738,1426.671c-5.059,0-9.307-3.941-9.62-9.059
c-0.325-5.318,3.722-9.893,9.041-10.219l28.889-1.767c5.349-0.326,9.894,3.723,10.219,9.041c0.325,5.318-3.722,9.893-9.041,10.219
l-28.889,1.766C1168.135,1426.665,1167.936,1426.671,1167.738,1426.671z M1254.358,1420.693c-4.914,0-9.115-3.738-9.592-8.73
c-0.508-5.304,3.38-10.015,8.685-10.523c10.205-0.975,19.481-2.043,28.357-3.264c5.276-0.72,10.146,2.966,10.872,8.244
c0.725,5.279-2.966,10.146-8.244,10.872c-9.142,1.257-18.676,2.354-29.148,3.356
C1254.976,1420.678,1254.665,1420.693,1254.358,1420.693z M1339.835,1406.05c-4.233,0-8.115-2.807-9.295-7.086
c-1.416-5.136,1.6-10.448,6.735-11.864c9.242-2.549,18.223-5.456,26.694-8.639c4.994-1.875,10.551,0.65,12.425,5.636
c1.875,4.988-0.649,10.55-5.636,12.425c-9.014,3.388-18.553,6.476-28.353,9.178
C1341.548,1405.937,1340.684,1406.05,1339.835,1406.05z M1418.99,1371.212c-3.131,0-6.201-1.522-8.057-4.329
c-2.938-4.445-1.716-10.43,2.729-13.368c7.542-4.985,14.948-10.622,22.014-16.754c4.022-3.493,10.115-3.063,13.609,0.962
c3.493,4.024,3.062,10.117-0.962,13.609c-7.698,6.683-15.781,12.832-24.022,18.28
C1422.663,1370.694,1420.817,1371.212,1418.99,1371.212z M1480.743,1310.868c-1.931,0-3.882-0.578-5.577-1.782
c-4.344-3.086-5.366-9.108-2.281-13.452c5.366-7.558,10.318-15.487,14.721-23.567c2.549-4.679,8.41-6.406,13.087-3.856
c4.679,2.549,6.406,8.409,3.856,13.087c-4.765,8.747-10.126,17.328-15.932,25.506
C1486.737,1309.454,1483.762,1310.868,1480.743,1310.868z M1517.216,1232.602c-0.778,0-1.568-0.094-2.356-0.292
c-5.169-1.297-8.306-6.538-7.009-11.707c2.256-8.984,3.898-18.121,4.878-27.155c0.575-5.299,5.342-9.13,10.632-8.55
c5.298,0.575,9.126,5.335,8.55,10.633c-1.076,9.911-2.875,19.928-5.346,29.771
C1525.467,1229.681,1521.535,1232.602,1517.216,1232.602z M1520.169,1146.498c-4.554,0-8.606-3.239-9.472-7.879
c-1.672-8.957-4.032-17.905-7.015-26.593c-1.731-5.04,0.953-10.527,5.992-12.257c5.036-1.729,10.528,0.953,12.257,5.992
c3.287,9.575,5.889,19.438,7.733,29.318c0.978,5.237-2.475,10.276-7.713,11.254
C1521.352,1146.444,1520.757,1146.498,1520.169,1146.498z M1486.631,1067.128c-3.053,0-6.055-1.445-7.93-4.142
c-5.223-7.514-11.025-14.791-17.243-21.63c-3.585-3.942-3.295-10.044,0.647-13.629c3.942-3.584,10.044-3.295,13.628,0.648
c6.781,7.457,13.11,15.396,18.811,23.597c3.041,4.375,1.961,10.387-2.415,13.428
C1490.452,1066.568,1488.532,1067.128,1486.631,1067.128z M1424.593,1007.152c-1.794,0-3.608-0.499-5.227-1.546
c-7.728-4.995-15.75-9.441-23.845-13.217l-0.314-0.146c-4.827-2.257-6.911-7.999-4.653-12.826c2.257-4.826,8-6.91,12.825-4.653
l0.276,0.129c8.915,4.158,17.716,9.036,26.183,14.508c4.475,2.892,5.758,8.864,2.866,13.339
C1430.859,1005.596,1427.759,1007.152,1424.593,1007.152z M1344.533,974.803c-0.7,0-1.41-0.076-2.123-0.236
c-8.759-1.966-18.077-3.643-27.693-4.985c-5.277-0.736-8.958-5.611-8.222-10.888c0.737-5.277,5.615-8.957,10.888-8.222
c10.137,1.415,19.98,3.187,29.254,5.268c5.199,1.167,8.467,6.327,7.3,11.527C1352.93,971.754,1348.947,974.803,1344.533,974.803z
M274.324,966.773c-5.318,0-9.634-4.305-9.647-9.625c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.008,0,0.016,0,0.023,0
c5.317,0,9.634,4.305,9.647,9.625c0.012,5.328-4.297,9.657-9.625,9.67l-28.943,0.066
C274.339,966.773,274.331,966.773,274.324,966.773z M361.152,966.573c-5.318,0-9.635-4.305-9.647-9.625
c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c5.338-0.002,9.657,4.297,9.67,9.625s-4.297,9.657-9.625,9.67l-28.943,0.066
C361.167,966.573,361.16,966.573,361.152,966.573z M447.979,966.373c-5.318,0-9.634-4.305-9.647-9.625
c-0.012-5.328,4.297-9.657,9.625-9.67l28.942-0.066c0.008,0,0.015,0,0.023,0c5.318,0,9.635,4.305,9.648,9.625
c0.012,5.328-4.297,9.657-9.626,9.67l-28.942,0.066C447.995,966.373,447.987,966.373,447.979,966.373z M534.808,966.174
c-5.318,0-9.635-4.305-9.648-9.625c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.008,0,0.015,0,0.023,0
c5.318,0,9.635,4.305,9.648,9.625c0.012,5.328-4.297,9.657-9.626,9.67l-28.943,0.066
C534.823,966.174,534.815,966.174,534.808,966.174z M621.636,965.973c-5.318,0-9.635-4.305-9.648-9.625
c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.008,0,0.015,0,0.023,0c5.318,0,9.635,4.305,9.648,9.625
c0.012,5.328-4.297,9.657-9.626,9.67l-28.943,0.066C621.651,965.973,621.643,965.973,621.636,965.973z M708.463,965.774
c-5.318,0-9.635-4.305-9.648-9.625c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.008,0,0.015,0,0.023,0
c5.318,0,9.635,4.305,9.648,9.625c0.012,5.328-4.297,9.657-9.626,9.67l-28.943,0.066
C708.478,965.774,708.471,965.774,708.463,965.774z M795.291,965.574c-5.318,0-9.635-4.305-9.648-9.625
c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c5.292,0.018,9.658,4.297,9.67,9.625c0.012,5.328-4.297,9.657-9.626,9.67
l-28.943,0.066C795.306,965.574,795.299,965.574,795.291,965.574z M882.12,965.374c-5.318,0-9.635-4.305-9.648-9.625
c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.008,0,0.016,0,0.024,0c5.317,0,9.634,4.305,9.647,9.625
c0.012,5.328-4.297,9.657-9.626,9.67l-28.943,0.066C882.135,965.374,882.127,965.374,882.12,965.374z M968.947,965.174
c-5.318,0-9.635-4.305-9.648-9.625c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.008,0,0.015,0,0.023,0
c5.318,0,9.635,4.305,9.648,9.625c0.012,5.328-4.297,9.657-9.626,9.67l-28.943,0.066
C968.962,965.174,968.955,965.174,968.947,965.174z M1258.376,965.13c-0.103,0-0.204-0.001-0.307-0.005
c-8.629-0.27-17.745-0.424-28.69-0.483c-5.328-0.029-9.623-4.372-9.595-9.7c0.029-5.31,4.342-9.595,9.647-9.595
c0.018,0,0.036,0,0.054,0c11.114,0.06,20.389,0.216,29.188,0.492c5.326,0.167,9.508,4.619,9.341,9.945
C1267.849,961.007,1263.564,965.13,1258.376,965.13z M1055.775,964.974c-5.318,0-9.635-4.305-9.648-9.625
c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.008,0,0.016,0,0.024,0c5.318,0,9.634,4.305,9.647,9.625
c0.012,5.328-4.297,9.657-9.626,9.67l-28.943,0.066C1055.79,964.974,1055.783,964.974,1055.775,964.974z M1142.603,964.775
c-5.318,0-9.635-4.305-9.648-9.625c-0.012-5.328,4.297-9.657,9.625-9.67l28.943-0.066c0.007,0,0.015,0,0.023,0
c5.318,0,9.635,4.305,9.648,9.625c0.012,5.328-4.297,9.657-9.626,9.67l-28.943,0.066
C1142.619,964.775,1142.611,964.775,1142.603,964.775z"/>
<path style="fill:#D13737;" d="M462.452,197.927c-156.647,0-283.638,126.991-283.638,283.638
c0,251.801,283.638,464.048,283.638,464.048S746.09,733.366,746.09,481.565C746.09,324.918,619.099,197.927,462.452,197.927z
M462.316,679.485c-109.374,0-198.045-88.671-198.045-198.055c0-109.374,88.671-198.045,198.045-198.045
c109.384,0,198.055,88.671,198.055,198.045C660.371,590.814,571.701,679.485,462.316,679.485z"/>
<path style="fill:#18ACB7;" d="M1737.548,1228.212c-156.647,0-283.638,126.991-283.638,283.638
c0,251.801,283.638,464.048,283.638,464.048s283.638-212.246,283.638-464.048
C2021.187,1355.203,1894.196,1228.212,1737.548,1228.212z M1737.413,1709.77c-109.374,0-198.045-88.671-198.045-198.055
c0-109.374,88.671-198.045,198.045-198.045c109.384,0,198.055,88.671,198.055,198.045
C1935.468,1621.1,1846.797,1709.77,1737.413,1709.77z"/>
</g>
</g>
</svg>


@@ -0,0 +1,3 @@
description: This file stores settings for Dart & Flutter DevTools.
documentation: https://docs.flutter.dev/tools/devtools/extensions#configure-extension-enablement-states
extensions:


@@ -0,0 +1,17 @@
flutter_launcher_icons:
  image_path: "assets/launcher/icon.png"
  # Android section
  android: true
  min_sdk_android: 21 # android min sdk (min 16, default 21)
  # iOS section
  ios: true
  remove_alpha_ios: true # the App Store rejects icons with an alpha channel
  # Web section
  web:
    generate: true
    image_path: "assets/launcher/icon.png"
    # background_color: "#hexcode"
    # theme_color: "#hexcode"


@@ -1,3 +1,9 @@
# fastlane secret
.env
secret.env
*.mobileprovision
report.xml
**/dgph
*.mode1v3
*.mode2v3


@@ -1 +1,2 @@
#include? "Pods/Target Support Files/Pods-Runner/Pods-Runner.debug.xcconfig"
#include "Generated.xcconfig"


@@ -1 +1,2 @@
#include? "Pods/Target Support Files/Pods-Runner/Pods-Runner.release.xcconfig"
#include "Generated.xcconfig"

frontend/ios/Gemfile Normal file

@@ -0,0 +1,5 @@
source "https://rubygems.org"
gem "fastlane"
gem "cocoapods"

frontend/ios/Gemfile.lock Normal file

@@ -0,0 +1,288 @@
GEM
  remote: https://rubygems.org/
  specs:
    CFPropertyList (3.0.7)
      base64
      nkf
      rexml
    activesupport (5.2.8.1)
      concurrent-ruby (~> 1.0, >= 1.0.2)
      i18n (>= 0.7, < 2)
      minitest (~> 5.1)
      tzinfo (~> 1.1)
    addressable (2.8.7)
      public_suffix (>= 2.0.2, < 7.0)
    algoliasearch (1.27.5)
      httpclient (~> 2.8, >= 2.8.3)
      json (>= 1.5.1)
    artifactory (3.0.17)
    atomos (0.1.3)
    aws-eventstream (1.3.0)
    aws-partitions (1.1004.0)
    aws-sdk-core (3.212.0)
      aws-eventstream (~> 1, >= 1.3.0)
      aws-partitions (~> 1, >= 1.992.0)
      aws-sigv4 (~> 1.9)
      jmespath (~> 1, >= 1.6.1)
    aws-sdk-kms (1.95.0)
      aws-sdk-core (~> 3, >= 3.210.0)
      aws-sigv4 (~> 1.5)
    aws-sdk-s3 (1.170.1)
      aws-sdk-core (~> 3, >= 3.210.0)
      aws-sdk-kms (~> 1)
      aws-sigv4 (~> 1.5)
    aws-sigv4 (1.10.1)
      aws-eventstream (~> 1, >= 1.0.2)
    babosa (1.0.4)
    base64 (0.2.0)
    claide (1.1.0)
    cocoapods (1.10.2)
      addressable (~> 2.6)
      claide (>= 1.0.2, < 2.0)
      cocoapods-core (= 1.10.2)
      cocoapods-deintegrate (>= 1.0.3, < 2.0)
      cocoapods-downloader (>= 1.4.0, < 2.0)
      cocoapods-plugins (>= 1.0.0, < 2.0)
      cocoapods-search (>= 1.0.0, < 2.0)
      cocoapods-trunk (>= 1.4.0, < 2.0)
      cocoapods-try (>= 1.1.0, < 2.0)
      colored2 (~> 3.1)
      escape (~> 0.0.4)
      fourflusher (>= 2.3.0, < 3.0)
      gh_inspector (~> 1.0)
      molinillo (~> 0.6.6)
      nap (~> 1.0)
      ruby-macho (~> 1.4)
      xcodeproj (>= 1.19.0, < 2.0)
    cocoapods-core (1.10.2)
      activesupport (> 5.0, < 6)
      addressable (~> 2.6)
      algoliasearch (~> 1.0)
      concurrent-ruby (~> 1.1)
      fuzzy_match (~> 2.0.4)
      nap (~> 1.0)
      netrc (~> 0.11)
      public_suffix
      typhoeus (~> 1.0)
    cocoapods-deintegrate (1.0.5)
    cocoapods-downloader (1.6.3)
    cocoapods-plugins (1.0.0)
      nap
    cocoapods-search (1.0.1)
    cocoapods-trunk (1.6.0)
      nap (>= 0.8, < 2.0)
      netrc (~> 0.11)
    cocoapods-try (1.2.0)
    colored (1.2)
    colored2 (3.1.2)
    commander (4.6.0)
      highline (~> 2.0.0)
    concurrent-ruby (1.3.4)
    declarative (0.0.20)
    digest-crc (0.6.5)
      rake (>= 12.0.0, < 14.0.0)
    domain_name (0.6.20240107)
    dotenv (2.8.1)
    emoji_regex (3.2.3)
    escape (0.0.4)
    ethon (0.16.0)
      ffi (>= 1.15.0)
    excon (0.112.0)
    faraday (1.10.4)
      faraday-em_http (~> 1.0)
      faraday-em_synchrony (~> 1.0)
      faraday-excon (~> 1.1)
      faraday-httpclient (~> 1.0)
      faraday-multipart (~> 1.0)
      faraday-net_http (~> 1.0)
      faraday-net_http_persistent (~> 1.0)
      faraday-patron (~> 1.0)
      faraday-rack (~> 1.0)
      faraday-retry (~> 1.0)
      ruby2_keywords (>= 0.0.4)
    faraday-cookie_jar (0.0.7)
      faraday (>= 0.8.0)
      http-cookie (~> 1.0.0)
    faraday-em_http (1.0.0)
    faraday-em_synchrony (1.0.0)
    faraday-excon (1.1.0)
    faraday-httpclient (1.0.1)
    faraday-multipart (1.0.4)
      multipart-post (~> 2)
    faraday-net_http (1.0.2)
    faraday-net_http_persistent (1.2.0)
    faraday-patron (1.0.0)
    faraday-rack (1.0.0)
    faraday-retry (1.0.3)
    faraday_middleware (1.2.1)
      faraday (~> 1.0)
    fastimage (2.3.1)
    fastlane (2.225.0)
      CFPropertyList (>= 2.3, < 4.0.0)
      addressable (>= 2.8, < 3.0.0)
      artifactory (~> 3.0)
      aws-sdk-s3 (~> 1.0)
      babosa (>= 1.0.3, < 2.0.0)
      bundler (>= 1.12.0, < 3.0.0)
      colored (~> 1.2)
      commander (~> 4.6)
      dotenv (>= 2.1.1, < 3.0.0)
      emoji_regex (>= 0.1, < 4.0)
      excon (>= 0.71.0, < 1.0.0)
      faraday (~> 1.0)
      faraday-cookie_jar (~> 0.0.6)
      faraday_middleware (~> 1.0)
      fastimage (>= 2.1.0, < 3.0.0)
      fastlane-sirp (>= 1.0.0)
      gh_inspector (>= 1.1.2, < 2.0.0)
      google-apis-androidpublisher_v3 (~> 0.3)
      google-apis-playcustomapp_v1 (~> 0.1)
      google-cloud-env (>= 1.6.0, < 2.0.0)
      google-cloud-storage (~> 1.31)
      highline (~> 2.0)
      http-cookie (~> 1.0.5)
      json (< 3.0.0)
      jwt (>= 2.1.0, < 3)
      mini_magick (>= 4.9.4, < 5.0.0)
      multipart-post (>= 2.0.0, < 3.0.0)
      naturally (~> 2.2)
      optparse (>= 0.1.1, < 1.0.0)
      plist (>= 3.1.0, < 4.0.0)
      rubyzip (>= 2.0.0, < 3.0.0)
      security (= 0.1.5)
      simctl (~> 1.6.3)
      terminal-notifier (>= 2.0.0, < 3.0.0)
      terminal-table (~> 3)
      tty-screen (>= 0.6.3, < 1.0.0)
      tty-spinner (>= 0.8.0, < 1.0.0)
      word_wrap (~> 1.0.0)
      xcodeproj (>= 1.13.0, < 2.0.0)
      xcpretty (~> 0.3.0)
      xcpretty-travis-formatter (>= 0.0.3, < 2.0.0)
    fastlane-sirp (1.0.0)
      sysrandom (~> 1.0)
    ffi (1.17.0)
    ffi (1.17.0-x86_64-darwin)
    fourflusher (2.3.1)
    fuzzy_match (2.0.4)
    gh_inspector (1.1.3)
    google-apis-androidpublisher_v3 (0.54.0)
      google-apis-core (>= 0.11.0, < 2.a)
    google-apis-core (0.11.3)
      addressable (~> 2.5, >= 2.5.1)
      googleauth (>= 0.16.2, < 2.a)
      httpclient (>= 2.8.1, < 3.a)
      mini_mime (~> 1.0)
      representable (~> 3.0)
      retriable (>= 2.0, < 4.a)
      rexml
    google-apis-iamcredentials_v1 (0.17.0)
      google-apis-core (>= 0.11.0, < 2.a)
    google-apis-playcustomapp_v1 (0.13.0)
      google-apis-core (>= 0.11.0, < 2.a)
    google-apis-storage_v1 (0.31.0)
      google-apis-core (>= 0.11.0, < 2.a)
    google-cloud-core (1.7.1)
      google-cloud-env (>= 1.0, < 3.a)
      google-cloud-errors (~> 1.0)
    google-cloud-env (1.6.0)
      faraday (>= 0.17.3, < 3.0)
    google-cloud-errors (1.4.0)
    google-cloud-storage (1.47.0)
      addressable (~> 2.8)
      digest-crc (~> 0.4)
      google-apis-iamcredentials_v1 (~> 0.1)
      google-apis-storage_v1 (~> 0.31.0)
      google-cloud-core (~> 1.6)
      googleauth (>= 0.16.2, < 2.a)
      mini_mime (~> 1.0)
    googleauth (1.8.1)
      faraday (>= 0.17.3, < 3.a)
      jwt (>= 1.4, < 3.0)
      multi_json (~> 1.11)
      os (>= 0.9, < 2.0)
      signet (>= 0.16, < 2.a)
    highline (2.0.3)
    http-cookie (1.0.7)
      domain_name (~> 0.5)
    httpclient (2.8.3)
    i18n (1.14.6)
      concurrent-ruby (~> 1.0)
    jmespath (1.6.2)
    json (2.8.1)
    jwt (2.9.3)
      base64
    mini_magick (4.13.2)
    mini_mime (1.1.5)
    minitest (5.25.1)
    molinillo (0.6.6)
multi_json (1.15.0)
multipart-post (2.4.1)
nanaimo (0.4.0)
nap (1.1.0)
naturally (2.2.1)
netrc (0.11.0)
nkf (0.2.0)
optparse (0.6.0)
os (1.1.4)
plist (3.7.1)
public_suffix (6.0.1)
rake (13.2.1)
representable (3.2.0)
declarative (< 0.1.0)
trailblazer-option (>= 0.1.1, < 0.2.0)
uber (< 0.2.0)
retriable (3.1.2)
rexml (3.3.9)
rouge (2.0.7)
ruby-macho (1.4.0)
ruby2_keywords (0.0.5)
rubyzip (2.3.2)
security (0.1.5)
signet (0.19.0)
addressable (~> 2.8)
faraday (>= 0.17.5, < 3.a)
jwt (>= 1.5, < 3.0)
multi_json (~> 1.10)
simctl (1.6.10)
CFPropertyList
naturally
sysrandom (1.0.5)
terminal-notifier (2.0.0)
terminal-table (3.0.2)
unicode-display_width (>= 1.1.1, < 3)
thread_safe (0.3.6)
trailblazer-option (0.1.2)
tty-cursor (0.7.1)
tty-screen (0.8.2)
tty-spinner (0.9.3)
tty-cursor (~> 0.7)
typhoeus (1.4.1)
ethon (>= 0.9.0)
tzinfo (1.2.11)
thread_safe (~> 0.1)
uber (0.1.0)
unicode-display_width (2.6.0)
word_wrap (1.0.0)
xcodeproj (1.27.0)
CFPropertyList (>= 2.3.3, < 4.0)
atomos (~> 0.1.3)
claide (>= 1.0.2, < 2.0)
colored2 (~> 3.1)
nanaimo (~> 0.4.0)
rexml (>= 3.3.6, < 4.0)
xcpretty (0.3.0)
rouge (~> 2.0.7)
xcpretty-travis-formatter (1.0.1)
xcpretty (~> 0.2, >= 0.0.7)
PLATFORMS
  ruby
  x86_64-darwin-23

DEPENDENCIES
  cocoapods
  fastlane

BUNDLED WITH
   2.5.23

frontend/ios/Podfile (new file)

@@ -0,0 +1,59 @@
# Uncomment this line to define a global platform for your project
# platform :ios, '12.0'

# CocoaPods analytics sends network stats synchronously affecting flutter build latency.
ENV['COCOAPODS_DISABLE_STATS'] = 'true'

project 'Runner', {
  'Debug' => :debug,
  'Profile' => :release,
  'Release' => :release,
}

def flutter_root
  generated_xcode_build_settings_path = File.expand_path(File.join('..', 'Flutter', 'Generated.xcconfig'), __FILE__)
  unless File.exist?(generated_xcode_build_settings_path)
    raise "#{generated_xcode_build_settings_path} must exist. If you're running pod install manually, make sure flutter pub get is executed first"
  end

  File.foreach(generated_xcode_build_settings_path) do |line|
    matches = line.match(/FLUTTER_ROOT\=(.*)/)
    return matches[1].strip if matches
  end
  raise "FLUTTER_ROOT not found in #{generated_xcode_build_settings_path}. Try deleting Generated.xcconfig, then run flutter pub get"
end

require File.expand_path(File.join('packages', 'flutter_tools', 'bin', 'podhelper'), flutter_root)

flutter_ios_podfile_setup

target 'Runner' do
  use_frameworks!
  use_modular_headers!

  flutter_install_all_ios_pods File.dirname(File.realpath(__FILE__))
  target 'RunnerTests' do
    inherit! :search_paths
  end
end

post_install do |installer|
  installer.pods_project.targets.each do |target|
    flutter_additional_ios_build_settings(target)
    target.build_configurations.each do |config|
      # You can remove unused permissions here
      # for more information: https://github.com/BaseflowIT/flutter-permission-handler/blob/master/permission_handler/ios/Classes/PermissionHandlerEnums.h
      config.build_settings['GCC_PREPROCESSOR_DEFINITIONS'] ||= [
        '$(inherited)',

        ## The 'PERMISSION_LOCATION' macro enables the `locationWhenInUse` and `locationAlways` permission. If
        ## the application only requires `locationWhenInUse`, only specify the `PERMISSION_LOCATION_WHENINUSE`
        ## macro.
        ##
        ## dart: [PermissionGroup.location, PermissionGroup.locationAlways, PermissionGroup.locationWhenInUse]
        'PERMISSION_LOCATION=1',
        'PERMISSION_LOCATION_WHENINUSE=0',
      ]
    end
  end
end
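The `flutter_root` helper in the Podfile above scans `Flutter/Generated.xcconfig` for a `FLUTTER_ROOT=` line and returns its value. A minimal standalone sketch of that parsing step (using a throwaway temp file in place of a real `Generated.xcconfig`, and a hypothetical `extract_flutter_root` name) could look like:

```ruby
require 'tempfile'

# Mirror the regex-based lookup used by the Podfile's flutter_root helper:
# scan each line for FLUTTER_ROOT=<value> and return the stripped value.
def extract_flutter_root(path)
  File.foreach(path) do |line|
    matches = line.match(/FLUTTER_ROOT\=(.*)/)
    return matches[1].strip if matches
  end
  nil # the real helper raises here instead
end

# Throwaway file standing in for Flutter/Generated.xcconfig.
xcconfig = Tempfile.new('Generated.xcconfig')
xcconfig.write("FLUTTER_TARGET=lib/main.dart\nFLUTTER_ROOT=/opt/flutter \n")
xcconfig.close

puts extract_flutter_root(xcconfig.path) # => /opt/flutter
```

Note that `strip` matters: `flutter pub get` writes the xcconfig with trailing whitespace on some platforms, and an unstripped path would break the later `require` of `podhelper`.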

frontend/ios/Podfile.lock (new file)

@@ -0,0 +1,87 @@
PODS:
  - Flutter (1.0.0)
  - geocoding_ios (1.0.5):
    - Flutter
  - geolocator_apple (1.2.0):
    - Flutter
  - Google-Maps-iOS-Utils (6.1.0):
    - GoogleMaps (~> 9.0)
  - google_maps_flutter_ios (0.0.1):
    - Flutter
    - Google-Maps-iOS-Utils (< 7.0, >= 5.0)
    - GoogleMaps (< 10.0, >= 8.4)
  - GoogleMaps (9.2.0):
    - GoogleMaps/Maps (= 9.2.0)
  - GoogleMaps/Maps (9.2.0)
  - map_launcher (0.0.1):
    - Flutter
  - path_provider_foundation (0.0.1):
    - Flutter
    - FlutterMacOS
  - permission_handler_apple (9.3.0):
    - Flutter
  - shared_preferences_foundation (0.0.1):
    - Flutter
    - FlutterMacOS
  - sqflite (0.0.3):
    - Flutter
    - FlutterMacOS
  - url_launcher_ios (0.0.1):
    - Flutter

DEPENDENCIES:
  - Flutter (from `Flutter`)
  - geocoding_ios (from `.symlinks/plugins/geocoding_ios/ios`)
  - geolocator_apple (from `.symlinks/plugins/geolocator_apple/ios`)
  - google_maps_flutter_ios (from `.symlinks/plugins/google_maps_flutter_ios/ios`)
  - map_launcher (from `.symlinks/plugins/map_launcher/ios`)
  - path_provider_foundation (from `.symlinks/plugins/path_provider_foundation/darwin`)
  - permission_handler_apple (from `.symlinks/plugins/permission_handler_apple/ios`)
  - shared_preferences_foundation (from `.symlinks/plugins/shared_preferences_foundation/darwin`)
  - sqflite (from `.symlinks/plugins/sqflite/darwin`)
  - url_launcher_ios (from `.symlinks/plugins/url_launcher_ios/ios`)

SPEC REPOS:
  trunk:
    - Google-Maps-iOS-Utils
    - GoogleMaps

EXTERNAL SOURCES:
  Flutter:
    :path: Flutter
  geocoding_ios:
    :path: ".symlinks/plugins/geocoding_ios/ios"
  geolocator_apple:
    :path: ".symlinks/plugins/geolocator_apple/ios"
  google_maps_flutter_ios:
    :path: ".symlinks/plugins/google_maps_flutter_ios/ios"
  map_launcher:
    :path: ".symlinks/plugins/map_launcher/ios"
  path_provider_foundation:
    :path: ".symlinks/plugins/path_provider_foundation/darwin"
  permission_handler_apple:
    :path: ".symlinks/plugins/permission_handler_apple/ios"
  shared_preferences_foundation:
    :path: ".symlinks/plugins/shared_preferences_foundation/darwin"
  sqflite:
    :path: ".symlinks/plugins/sqflite/darwin"
  url_launcher_ios:
    :path: ".symlinks/plugins/url_launcher_ios/ios"

SPEC CHECKSUMS:
  Flutter: e0871f40cf51350855a761d2e70bf5af5b9b5de7
  geocoding_ios: bcbdaa6bddd7d3129c9bcb8acddc5d8778689768
  geolocator_apple: d981750b9f47dbdb02427e1476d9a04397beb8d9
  Google-Maps-iOS-Utils: 0a484b05ed21d88c9f9ebbacb007956edd508a96
  google_maps_flutter_ios: 0291eb2aa252298a769b04d075e4a9d747ff7264
  GoogleMaps: 634ec3ca99698b31ca2253d64f017217d70cfb38
  map_launcher: fe43bda6720bb73c12fcc1bdd86123ff49a4d4d6
  path_provider_foundation: 080d55be775b7414fd5a5ef3ac137b97b097e564
  permission_handler_apple: 4ed2196e43d0651e8ff7ca3483a069d469701f2d
  shared_preferences_foundation: 9e1978ff2562383bd5676f64ec4e9aa8fa06a6f7
  sqflite: c35dad70033b8862124f8337cc994a809fcd9fa3
  url_launcher_ios: 694010445543906933d732453a59da0a173ae33d

PODFILE CHECKSUM: bd1a78910c05ac1e3a220e80f392c61ab2cc8789

COCOAPODS: 1.10.2
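The `SPEC CHECKSUMS` section of the Podfile.lock above pins each pod to a 40-character hex digest; CocoaPods computes these as a SHA-1 over the resolved podspec contents (an assumption worth verifying against your CocoaPods version, but it matches `Pod::Specification#checksum`). A sketch of producing the same kind of digest, using a throwaway file in place of a real podspec:

```ruby
require 'digest'
require 'tempfile'

# Throwaway file standing in for a resolved podspec.
spec = Tempfile.new('Demo.podspec')
spec.write("Pod::Spec.new do |s|\n  s.name = 'Demo'\nend\n")
spec.close

# SHA-1 of the file contents, the same shape as a SPEC CHECKSUMS entry.
checksum = Digest::SHA1.hexdigest(File.read(spec.path))
puts checksum # 40 lowercase hex characters
```

This is why the lockfile changes whenever a podspec changes even if the version number does not: the digest covers the spec's full contents, not just its version field.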

Some files were not shown because too many files have changed in this diff.