126 Commits

Author SHA1 Message Date
e6b4c3cc1a add staging and prod group roles
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-21 16:28:43 -07:00
f46bdfc87f Merge pull request 'Tweak Reviewer Cards' (#345) from review-cards into staging
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
Reviewed-on: #345
Reviewed-by: itzaname <itzaname@noreply@itzana.me>
2026-03-08 00:18:12 +00:00
d7823a82c0 web: fix script review card style
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-07 16:14:01 -08:00
ad8be22b87 web: use pointer for review cards 2026-03-07 16:13:47 -08:00
2f2c51be36 Merge pull request 'Update style to match other StrafesNET sites' (#343) from feature/style-update into staging
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
Reviewed-on: #343
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
2026-03-07 23:54:58 +00:00
9f952d7e54 Update style to match other sites
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-07 18:35:51 -05:00
d26126c9d3 combobulator: use up to 16 parallel requests
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-06 10:07:26 -08:00
b6ac6ce47f validator: switch futures to leaner futures-util 2026-03-06 09:50:12 -08:00
4e8ebd826a combobulator: skip 403
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-06 07:39:28 -08:00
0005a55ae0 combobulator: don't give up for conversion errors
All checks were successful
continuous-integration/drone/pr Build is passing
continuous-integration/drone/push Build is passing
2026-03-05 09:45:30 -08:00
4e3048e272 Add Combobulate Endpoint (#338)
Some checks failed
continuous-integration/drone/push Build is failing
continuous-integration/drone/pr Build is passing
Adds the ability to seed a single map for combobulation.

Reviewed-on: #338
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-03-05 17:12:05 +00:00
5549a123a2 update deps, notably rbx_loader
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-05 08:57:33 -08:00
77d43e1e25 combobulator: skip 404
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-04 16:26:45 -08:00
6d9fb5bca6 update deps, notably rbx_loader
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-04 09:30:20 -08:00
2118a8ab35 Add string search for display_name
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-03 20:55:05 -05:00
277cd819c2 Categorize Errors to avoid HTTP 500 (#326)
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
- Depends on #325 (laziness)
- Closes #148

No guarantees we won't see 500s, but I tried ok

Reviewed-on: #326
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-03-03 17:51:58 +00:00
c22717831d Check for maps with the exact same name on submit (#325)
All checks were successful
continuous-integration/drone/push Build is passing
Closes #273.
Could be better but meh.

Reviewed-on: #325
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-03-03 17:49:08 +00:00
ed8a54370c Limit DisplayName and Creator to 50 characters (#323)
Some checks failed
continuous-integration/drone/push Build is failing
Closes #276

Reviewed-on: #323
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-03-03 17:48:10 +00:00
7756bbb06d update deps, notably rbx_loader & map_tool
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-03 09:38:20 -08:00
3b8da9a8a3 list workspace dependencies 2026-03-03 09:33:58 -08:00
46290834c3 combobulator: save a clone in cold path
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-02 17:28:20 -08:00
ec4e0cf6fa combobulator: use cached assets
All checks were successful
continuous-integration/drone/pr Build is passing
continuous-integration/drone/push Build is passing
2026-03-02 17:19:38 -08:00
3dff802bb1 update roblox_emulator to fix infinite luau loops
Some checks failed
continuous-integration/drone/push Build is failing
continuous-integration/drone/pr Build is passing
2026-03-02 16:54:59 -08:00
a6ff551bee combobulator: skip "Asset is not approved for the requester"
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-02 16:24:20 -08:00
dbd28ea87b Rework Combobulator Texture Loading (#329)
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
rbx_loader was attempting to load textures and other assets from disk.  Rework the system to stop implicitly loading from disk.

Reviewed-on: #329
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-03-02 23:59:40 +00:00
0efb07b52a Merge remote-tracking branch 'origin/master' into staging
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-02 10:24:12 -08:00
d4e50c2d37 Add SNFM download endpoints and batch seed endpoint (#328)
All checks were successful
continuous-integration/drone/push Build is passing
Adds download endpoint to RPC and Public API
Adds bulk seed endpoint: POST /v1/maps-admin/seed-combobulator

Reviewed-on: #328
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2026-03-02 02:39:41 +00:00
501b0933e6 Remove unused struct
Some checks failed
continuous-integration/drone/pr Build is failing
continuous-integration/drone/push Build is passing
2026-03-01 20:11:22 -05:00
078a3e4c4a Add map seed endpoint
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-01 17:41:11 -05:00
02873e82b6 Just use normal asset download
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-01 17:10:53 -05:00
0e2ffcd570 Doc updates
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-01 16:39:58 -05:00
c788344bf3 Add snfm download endpoints
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-01 16:28:37 -05:00
0711774153 Merge pull request 'Deploy staging' (#327) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #327
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
2026-03-01 20:48:31 +00:00
a8f44179a3 Don't version maps
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-03-01 15:21:34 -05:00
e1862d3917 Handle archived assets
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 15:05:10 -05:00
da96f1a090 I love gzip
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 14:49:16 -05:00
05c1107e91 Validator fixes
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 13:55:33 -05:00
83e257a4d5 Change retention policy
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 13:39:02 -05:00
b197791509 Debian fix
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 13:24:10 -05:00
91c2d87d2f ssl fix
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 13:18:50 -05:00
225e095c92 Why do I do this to myself
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 01:41:44 -05:00
2a6099480e Drop alpine
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 01:37:48 -05:00
d6074c4b78 Compile fixes
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 01:33:41 -05:00
f3a677dc20 Attempt openssl fixes
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 01:29:26 -05:00
f4209ecd0a Increase timeout to 15 min
Some checks failed
continuous-integration/drone/push Build is failing
2026-03-01 01:23:09 -05:00
e83c9db866 Attempt combobulation
All checks were successful
continuous-integration/drone/push Build is passing
2026-03-01 01:19:27 -05:00
7bec80a2fe Reviewer Dashboard: Click overview cards to change tab (#322)
Some checks failed
continuous-integration/drone/push Build is failing
I keep thinking I can click the cards to select submissions or mapfixes, so make it actually do that.

Reviewed-on: #322
Reviewed-by: itzaname <itzaname@noreply@itzana.me>
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-01-16 21:14:19 +00:00
3e77edb1cc Update Submission Button (#321)
Some checks failed
continuous-integration/drone/push Build is failing
Closes #318

Reviewed-on: #321
Reviewed-by: itzaname <itzaname@noreply@itzana.me>
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-01-16 21:12:55 +00:00
0b7ca534f3 Add Maptest Integration GRPC Controllers (#317)
Some checks failed
continuous-integration/drone/push Build is failing
Creates a GRPC controller for Submissions and Mapfixes.  This is intended to be used from the AOR group games via game-rpc running in "maptest" mode.

Reviewed-on: #317
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2026-01-07 20:32:10 +00:00
de864ac8ef Merge pull request 'Deploy Upload Escape Hatch' (#320) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #320
2026-01-06 19:20:38 +00:00
264ce38c08 Merge pull request 'Upload->Release Escape Hatch' (#319) from reject-upload into staging
Some checks failed
continuous-integration/drone/push Build is failing
continuous-integration/drone/pr Build is passing
Reviewed-on: #319
2026-01-06 19:18:18 +00:00
b1e10dc50e web: show RequestChanges button in Uploaded status
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2026-01-06 11:13:41 -08:00
755616f46c backend: allow request changes for uploaded models 2026-01-06 11:10:37 -08:00
9d9ab20952 Merge pull request 'Deploy nudges and action confirmation' (#311) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #311
2025-12-28 02:06:18 +00:00
e41d34dd3d Group buttons and add confirmation dialogues (#310)
Some checks failed
continuous-integration/drone/pr Build is passing
continuous-integration/drone/push Build is failing
Reviewer:
<img width="409" alt="image.png" src="attachments/a090c61e-a2d8-4685-ae64-547851d1ee84">
Submitter:
<img width="404" alt="image.png" src="attachments/9205a438-1f1f-4af4-b9a0-6a8d56580afa">
<img width="411" alt="image.png" src="attachments/7ae8115b-3376-4306-b9b9-acc12226abb3">
Admin:
<img width="392" alt="image.png" src="attachments/07a182d1-5375-4195-bfda-c14f09469cbe">
<img width="388" alt="image.png" src="attachments/ce82017d-5c1d-4a93-9247-9b5608f9030e">

Confirmation Dialogue:
<img width="545" alt="image.png" src="attachments/1efff8be-1d41-429e-8c6e-3d36b7dad128">

Example where both groups show up:
<img width="404" alt="image.png" src="attachments/b0ca4be2-7c58-4c0c-9a5f-dcd89e23b08f">

Reviewed-on: #310
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-28 00:34:58 +00:00
f49e27e230 Support editing map fix descriptions (#309)
All checks were successful
continuous-integration/drone/push Build is passing
The description can be edited by the **submitter** only if the status is Changes Requested or Under Construction.

<img width="734" alt="image.png" src="attachments/9fd7b838-f946-4091-a396-ef66f5e655bc">
<img width="724" alt="image.png" src="attachments/f65f059e-af97-448a-9627-fee827d30e59">

Reviewed-on: #309
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 23:40:42 +00:00
d500462fc7 Add user nudges for certain statuses (#308)
Some checks failed
continuous-integration/drone/push Build is failing
Shows a badge icon on the audit tab when there are validator errors/checklists, to direct attention to them. The nudge message is shown ONLY to the submitter.

![image.png](/attachments/f5cd9ab6-b996-40b2-ad43-fa5e9b28caf5)
![image.png](/attachments/9aba2132-ec85-4ae9-b0fa-be253ecc2355)

Closes !205

Reviewed-on: #308
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 23:30:38 +00:00
ee2bc94312 Add releasing status to the processing list (#307)
All checks were successful
continuous-integration/drone/push Build is passing
Closes !269

Reviewed-on: #307
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 22:25:39 +00:00
67ece176c6 Merge pull request 'Deploy script review update' (#306) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #306
2025-12-27 21:49:28 +00:00
84edc71574 Add game name to review page (#305)
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
Deduped all the game name usage to a single lib. Closes !281

<img width="785" alt="image.png" src="attachments/0f226438-fed1-40b2-81a9-2988dd2d4a33">

Reviewed-on: #305
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 19:56:33 +00:00
7c5d8a2163 Add script review page (#304)
All checks were successful
continuous-integration/drone/push Build is passing
Closes !2

Added review dashboard button as well.

<img width="1313" alt="image.png" src="attachments/a2abd430-7ff6-431a-9261-82e026de58f5">

![image.png](/attachments/e1ba3536-2869-4661-b46c-007ddaff8f3e)

Reviewed-on: #304
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 19:56:19 +00:00
b31b3bed5f Merge pull request 'Deploy workflow timeline' (#302) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #302
2025-12-27 08:28:42 +00:00
7eaa84a0ed Change Timeline Text (#301)
All checks were successful
continuous-integration/drone/pr Build is passing
continuous-integration/drone/push Build is passing
Some tweaks to the descriptions.  Evidently I didn't read carefully enough.

Reviewed-on: #301
Reviewed-by: itzaname <itzaname@noreply@itzana.me>
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2025-12-27 08:19:17 +00:00
cf0cf9da7a Add workflow timeline (#300)
All checks were successful
continuous-integration/drone/push Build is passing
Closes !232

<img width="763" alt="image.png" src="attachments/559715f5-630e-4029-a19b-c9f4cf4c7270">

Reviewed-on: #300
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 08:04:02 +00:00
34cd1c7c26 Merge pull request 'Deploy dashboard update' (#299) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #299
Reviewed-by: itzaname <itzaname@noreply@itzana.me>
2025-12-27 05:41:53 +00:00
74565e567a Fix "0" displaying in "Review Dashboard" button on user dashboard (#298)
All checks were successful
continuous-integration/drone/pr Build is passing
continuous-integration/drone/push Build is passing
The review dashboard link is only shown when the user has the correct roles; a normal user would see the text "0" instead of the button.

Reviewed-on: #298
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 05:39:33 +00:00
ea65794255 Cycle before and after images every 1.5 seconds (#295)
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
The images should auto cycle now that the thumbnails are working.

I don't know how to test this!  This is what I tried:
```
bun install
bun run build
VITE_API_HOST=https://maps.staging.strafes.net/v1 bun run preview
```
but the mapfixes page won't load the mapfixes.

Reviewed-on: #295
Reviewed-by: itzaname <itzaname@noreply@itzana.me>
Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Co-committed-by: Rhys Lloyd <krakow20@gmail.com>
2025-12-27 05:26:04 +00:00
58706a5687 Add user/reviewer dashboard (#297)
All checks were successful
continuous-integration/drone/push Build is passing
Adds "at a glance" dashboard so life is less painful.

![image.png](/attachments/43e83777-7196-4274-9adc-e1268e43bc0f)
![image.png](/attachments/1cbe99ab-50b8-443a-aa48-ad9107ccfb1e)

Reviewed-on: #297
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-27 05:20:45 +00:00
efeb525e19 Merge pull request 'Add mapfix history on maps page' (#294) from feature/mapfix-list into staging
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #294
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
2025-12-27 04:51:03 +00:00
5a1fe60a7b fix quat docker
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-26 19:13:03 -08:00
01cfe67848 Just exclude rejected and released for active list
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2025-12-26 20:38:18 -05:00
a19bc4d380 Add mapfix history on maps page
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-26 20:32:55 -05:00
ae006565d6 Merge pull request 'Fix overflow on mapfix/submission' (#293) from fix/overflow into staging
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #293
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
2025-12-27 00:44:26 +00:00
57bca99109 Fix overflow
All checks were successful
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is passing
2025-12-26 19:42:36 -05:00
058455efd2 Merge pull request 'Deploy updates' (#291) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #291
2025-12-26 04:46:53 +00:00
cd09c9b18e Populate username for map fixes by author id
Some checks failed
continuous-integration/drone/push Build is failing
continuous-integration/drone/pr Build is passing
2025-12-25 20:42:22 -08:00
e48cbaff72 Make maps behave like normal link 2025-12-25 20:42:22 -08:00
140d58b808 Make comments support newlines 2025-12-25 20:42:22 -08:00
ba761549b8 Force dark theme 2025-12-25 20:42:22 -08:00
86643fef8d Merge branch 'master' into staging 2025-12-25 20:42:18 -08:00
96af864c5e Deploy staging to prod (#286)
All checks were successful
continuous-integration/drone/push Build is passing
Pull in validator changes and full ui rework to remove nextjs.

Co-authored-by: Rhys Lloyd <krakow20@gmail.com>
Reviewed-on: #286
Reviewed-by: Rhys Lloyd <quaternions@noreply@itzana.me>
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-26 03:30:36 +00:00
7db89fd99b Fix bun lock file
All checks were successful
continuous-integration/drone/pr Build is passing
continuous-integration/drone/push Build is passing
2025-12-25 22:10:29 -05:00
f2bb1b078d Fix content width and standardize on skeleton loading
Some checks failed
continuous-integration/drone/push Build is passing
continuous-integration/drone/pr Build is failing
2025-12-25 21:37:23 -05:00
66878fba4e Switch loading text to skeleton
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-25 21:02:15 -05:00
bda99550be Fix submission icon 2025-12-25 21:00:28 -05:00
8a216c7e82 Add username api
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-25 20:55:15 -05:00
e5277c05a1 Avatar image loading
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-25 20:38:17 -05:00
e4af76cfd4 Fix api endpoint
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-25 20:22:24 -05:00
30db1cc375 Fix the build issues
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-25 19:52:01 -05:00
b50c84f8cf Use port 3000
Some checks failed
continuous-integration/drone/push Build is failing
2025-12-25 19:49:52 -05:00
7589ef7df6 Fix dockerfile for spa
Some checks failed
continuous-integration/drone/push Build was killed
2025-12-25 19:49:06 -05:00
8ab8c441b0 Home page and header fixes
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-25 19:45:16 -05:00
a26b228ebe Add 404 page 2025-12-25 19:45:16 -05:00
3654755540 Thumbnail/nav cleanup 2025-12-25 19:45:16 -05:00
c2b50ffab2 Cleanup home/nav 2025-12-25 19:45:16 -05:00
75756917b1 some theming 2025-12-25 19:45:16 -05:00
8989c08857 theme 2025-12-25 19:45:16 -05:00
b2232f4177 Initial work to nuke nextjs 2025-12-25 19:45:16 -05:00
7d1c4d2b6c Add stats endpoint
Some checks failed
continuous-integration/drone Build was killed
continuous-integration/drone/push Build is passing
2025-12-25 18:58:52 -05:00
ca401d4b96 Add batch thumbnail endpoint (#285)
All checks were successful
continuous-integration/drone/push Build is passing
Step 1 of eliminating nextjs is adding a way to query thumbnails from Roblox, since nextjs currently handles that. This implements a batch endpoint with caching. Bonus: thumbnails will actually work once we start using this.

Reviewed-on: #285
Co-authored-by: itzaname <me@sliving.io>
Co-committed-by: itzaname <me@sliving.io>
2025-12-25 22:56:59 +00:00
9ab80931bf remove unfulfilled lints
All checks were successful
continuous-integration/drone/push Build is passing
2025-12-09 14:34:39 -08:00
09022e7292 change allow to expect 2025-12-09 14:34:16 -08:00
47c0fff0ec Merge pull request 'Update javascript' (#283) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #283
2025-12-06 04:48:21 +00:00
b7c28616ad Merge pull request 'submissions: Fix Maps.Update Date + Release Date Mixup' (#282) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #282
2025-09-29 02:14:51 +00:00
89ab25dfb9 Merge pull request 'deploy fixes' (#279) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #279
2025-09-23 22:41:35 +00:00
b0b5ff0725 Merge pull request 'web: add missing button lost in refactor' (#275) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #275
2025-09-17 00:12:15 +00:00
0532965d37 Merge pull request 'Maps Metadata Maintenance' (#267) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #267
2025-08-16 06:24:19 +00:00
f59979987f Merge pull request 'Deploy Public API' (#256) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #256
2025-08-08 22:07:37 +00:00
a232269d54 Merge pull request 'Extend Web API Maps With New Fields' (#250) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #250
2025-07-26 03:29:09 +00:00
a7c4ca4b49 Merge pull request 'Implement Maps' (#248) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #248
2025-07-26 01:26:26 +00:00
ca9f82a5aa Merge pull request 'Set Download File Name' (#245) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #245
2025-07-23 09:32:27 +00:00
e1a2f6f075 Merge pull request 'Fix gRPC' (#244) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #244
2025-07-23 04:53:29 +00:00
dad904cd86 Merge pull request 'Convert Validator API to gRPC' (#239) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #239
2025-07-22 04:32:04 +00:00
ad7117a69c Merge pull request 'Scream Test Backend Overhaul' (#237) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #237
2025-07-18 06:28:18 +00:00
d566591ea6 Merge pull request 'Fix Audit Event Order + Check Unanchored Parts' (#234) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #234
2025-07-16 06:47:29 +00:00
424ef6238b Merge pull request 'Prevent Mapfix Duplicates + Correctly Report Transaction Errors' (#221) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #221
2025-07-01 12:40:26 +00:00
0f0ab4d3e0 Merge pull request 'Update Roblox Api + Update Deps' (#217) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #217
2025-07-01 08:47:27 +00:00
3e2d782289 Merge pull request 'QoL Web Changes + Map Download Permission Fix' (#214) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #214
2025-06-30 10:20:03 +00:00
dc446c545f Fix Bypass Submit + Audit Checklist + Map Download Button (#207)
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #207
2025-06-24 06:41:56 +00:00
e234a87d05 Replace Bypass Submit With Submit Unchecked + Error Endpoint (#200)
Some checks are pending
continuous-integration/drone/push Build is running
Reviewed-on: #200
Co-authored-by: Quaternions <krakow20@gmail.com>
Co-committed-by: Quaternions <krakow20@gmail.com>
2025-06-23 23:39:18 -07:00
8ab772ea81 Validate Asset Version + Website QoL + Script Names Fix (#193)
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #193
2025-06-10 23:53:07 +00:00
9b58b1d26a Frontend Rework (#185)
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #185
2025-06-09 01:09:17 +00:00
7689001e74 Merge pull request '404 / 500 Thumbnails + Fix Regex Capture Groups' (#168) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #168
2025-06-07 04:02:26 +00:00
e89abed3d5 Merge pull request 'Thumbnail Fixes + Bypass Submit Button' (#161) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #161
2025-06-05 01:34:35 +00:00
b792d33164 Merge pull request 'Update Rust Dependencies (Roblox Format Zstd Support)' (#142) from staging into master
All checks were successful
continuous-integration/drone/push Build is passing
Reviewed-on: #142
2025-06-01 23:13:58 +00:00
929b5949f0 Merge pull request 'Snapshot "Working" Code' (#139) from staging into master
Some checks failed
continuous-integration/drone/push Build is failing
Reviewed-on: #139
2025-04-27 21:21:05 +00:00
154 changed files with 26205 additions and 3326 deletions


@@ -24,7 +24,7 @@ steps:
- staging
- name: build-validator
image: clux/muslrust:1.91.0-stable
image: rust:1.92
commands:
- make build-validator
when:
@@ -32,6 +32,15 @@ steps:
- master
- staging
- name: build-combobulator
image: rust:1.92
commands:
- make build-combobulator
when:
branch:
- master
- staging
- name: build-frontend
image: oven/bun:1.3.3
commands:
@@ -112,6 +121,29 @@ steps:
event:
- push
- name: image-combobulator
image: plugins/docker
settings:
registry: registry.itzana.me
repo: registry.itzana.me/strafesnet/maptest-combobulator
tags:
- ${DRONE_BRANCH}-${DRONE_BUILD_NUMBER}
- ${DRONE_BRANCH}
username:
from_secret: REGISTRY_USER
password:
from_secret: REGISTRY_PASS
dockerfile: combobulator/Containerfile
context: .
depends_on:
- build-combobulator
when:
branch:
- master
- staging
event:
- push
- name: deploy
image: argoproj/argocd:latest
commands:
@@ -119,6 +151,7 @@ steps:
- argocd app --grpc-web set ${DRONE_BRANCH}-maps-service --kustomize-image registry.itzana.me/strafesnet/maptest-api:${DRONE_BRANCH}-${DRONE_BUILD_NUMBER}
- argocd app --grpc-web set ${DRONE_BRANCH}-maps-service --kustomize-image registry.itzana.me/strafesnet/maptest-frontend:${DRONE_BRANCH}-${DRONE_BUILD_NUMBER}
- argocd app --grpc-web set ${DRONE_BRANCH}-maps-service --kustomize-image registry.itzana.me/strafesnet/maptest-validator:${DRONE_BRANCH}-${DRONE_BUILD_NUMBER}
- argocd app --grpc-web set ${DRONE_BRANCH}-maps-service --kustomize-image registry.itzana.me/strafesnet/maptest-combobulator:${DRONE_BRANCH}-${DRONE_BUILD_NUMBER}
environment:
USERNAME:
from_secret: ARGO_USER
@@ -128,6 +161,7 @@ steps:
- image-backend
- image-frontend
- image-validator
- image-combobulator
when:
branch:
- master
@@ -143,12 +177,13 @@ steps:
depends_on:
- build-backend
- build-validator
- build-combobulator
- build-frontend
when:
event:
- pull_request
---
kind: signature
hmac: 6de9d4b91f14b30561856daf275d1fd523e1ce7a5a3651b660f0d8907b4692fb
hmac: a654fea05ccf642bb3a41ce777808ff995c8bd7286f2403fae179ce0db025619
...

Cargo.lock (generated, 3011 lines): diff suppressed because it is too large


@@ -1,6 +1,17 @@
[workspace]
members = [
"combobulator",
"validation",
"submissions-api-rs",
]
resolver = "2"
[workspace.dependencies]
async-nats = "0.46.0"
futures-util = "0.3.31"
rbx_asset = { version = "0.5.0", features = ["gzip", "rustls-tls"], default-features = false, registry = "strafesnet" }
rbx_binary = "2.0.1"
rbx_dom_weak = "4.1.0"
serde = { version = "1.0.215", features = ["derive"] }
serde_json = "1.0.133"
tokio = { version = "1.41.1", features = ["macros", "rt-multi-thread", "signal"] }


@@ -7,14 +7,17 @@ build-backend:
CGO_ENABLED=0 GOOS=linux go build -a -installsuffix cgo -o build/server cmd/maps-service/service.go
build-validator:
cargo build --release --target x86_64-unknown-linux-musl --bin maps-validation
cargo build --release --bin maps-validation
build-combobulator:
cargo build --release --bin maps-combobulator
build-frontend:
rm -rf web/build
cd web && bun install --frozen-lockfile
cd web && bun run build
build: build-backend build-validator build-frontend
build: build-backend build-validator build-combobulator build-frontend
# image
image-backend:
@@ -23,6 +26,9 @@ image-backend:
image-validator:
docker build . -f validation/Containerfile -t maptest-validator
image-combobulator:
docker build . -f combobulator/Containerfile -t maptest-combobulator
image-frontend:
docker build web -f web/Containerfile -t maptest-frontend
@@ -33,9 +39,12 @@ docker-backend:
docker-validator:
make build-validator
make image-validator
docker-combobulator:
make build-combobulator
make image-combobulator
docker-frontend:
make image-frontend
docker: docker-backend docker-validator docker-frontend
docker: docker-backend docker-validator docker-combobulator docker-frontend
.PHONY: clean build-backend build-validator build-frontend build image-backend image-validator image-frontend docker-backend docker-validator docker-frontend docker
.PHONY: clean build-backend build-validator build-combobulator build-frontend build image-backend image-validator image-combobulator image-frontend docker-backend docker-validator docker-combobulator docker-frontend docker

combobulator/Cargo.toml Normal file

@@ -0,0 +1,22 @@
[package]
name = "maps-combobulator"
version = "0.1.0"
edition = "2024"
[dependencies]
async-nats.workspace = true
aws-config = { version = "1", features = ["behavior-version-latest"] }
aws-sdk-s3 = "1"
futures-util.workspace = true
map-tool = { version = "3.0.0", registry = "strafesnet", features = ["roblox"], default-features = false }
rbx_asset.workspace = true
rbx_binary.workspace = true
rbx_dom_weak.workspace = true
rbxassetid = { version = "0.1.0", registry = "strafesnet" }
serde.workspace = true
serde_json.workspace = true
strafesnet_deferred_loader = { version = "0.6.0", registry = "strafesnet" }
strafesnet_rbx_loader = { version = "0.10.0", registry = "strafesnet" }
strafesnet_snf = { version = "0.3.2", registry = "strafesnet" }
tokio.workspace = true
tokio-stream = "0.1"


@@ -0,0 +1,4 @@
FROM debian:trixie-slim AS runtime
RUN apt-get update && apt-get install -y --no-install-recommends libssl3t64 ca-certificates && rm -rf /var/lib/apt/lists/*
COPY /target/release/maps-combobulator /
ENTRYPOINT ["/maps-combobulator"]

combobulator/src/loader.rs Normal file

@@ -0,0 +1,152 @@
use std::collections::HashMap;
use rbxassetid::{RobloxAssetId,RobloxAssetIdParseErr};
use strafesnet_deferred_loader::{loader::Loader,texture::Texture};
use strafesnet_rbx_loader::mesh::{MeshIndex,MeshType,MeshWithSize};
// disallow non-static lifetimes
fn static_ustr(s:&'static str)->rbx_dom_weak::Ustr{
rbx_dom_weak::ustr(s)
}
#[expect(dead_code)]
#[derive(Debug)]
pub enum TextureError{
NoTexture,
RobloxAssetIdParse(RobloxAssetIdParseErr),
}
impl std::fmt::Display for TextureError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for TextureError{}
impl From<RobloxAssetIdParseErr> for TextureError{
fn from(value:RobloxAssetIdParseErr)->Self{
Self::RobloxAssetIdParse(value)
}
}
pub struct TextureLoader{
textures:HashMap<RobloxAssetId,Texture>,
}
impl TextureLoader{
pub fn new()->Self{
Self{
textures:HashMap::new(),
}
}
pub fn insert(&mut self,asset_id:RobloxAssetId,texture:Vec<u8>){
self.textures.insert(asset_id,Texture::ImageDDS(texture));
}
}
impl Loader for TextureLoader{
type Error=TextureError;
type Index<'a>=&'a str;
type Resource=Texture;
fn load(&mut self,index:Self::Index<'_>)->Result<Self::Resource,Self::Error>{
let asset_id:RobloxAssetId=index.parse()?;
let data=self.textures.get(&asset_id).ok_or(TextureError::NoTexture)?.clone();
Ok(data)
}
}
#[expect(dead_code)]
#[derive(Debug)]
pub enum MeshError{
NoMesh,
RobloxAssetIdParse(RobloxAssetIdParseErr),
Mesh(strafesnet_rbx_loader::mesh::Error),
Union(strafesnet_rbx_loader::union::Error),
DecodeBinary(rbx_binary::DecodeError),
OneChildPolicy,
MissingInstance,
}
impl std::fmt::Display for MeshError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for MeshError{}
impl From<RobloxAssetIdParseErr> for MeshError{
fn from(value:RobloxAssetIdParseErr)->Self{
Self::RobloxAssetIdParse(value)
}
}
impl From<strafesnet_rbx_loader::mesh::Error> for MeshError{
fn from(value:strafesnet_rbx_loader::mesh::Error)->Self{
Self::Mesh(value)
}
}
impl From<strafesnet_rbx_loader::union::Error> for MeshError{
fn from(value:strafesnet_rbx_loader::union::Error)->Self{
Self::Union(value)
}
}
impl From<rbx_binary::DecodeError> for MeshError{
fn from(value:rbx_binary::DecodeError)->Self{
Self::DecodeBinary(value)
}
}
pub struct MeshLoader{
meshes:HashMap<RobloxAssetId,MeshWithSize>,
unions:HashMap<RobloxAssetId,rbx_dom_weak::WeakDom>,
}
impl MeshLoader{
pub fn new()->Self{
Self{
meshes:HashMap::new(),
unions:HashMap::new(),
}
}
pub fn insert_mesh(&mut self,asset_id:RobloxAssetId,mesh:MeshWithSize){
self.meshes.insert(asset_id,mesh);
}
pub fn insert_union(&mut self,asset_id:RobloxAssetId,union:rbx_dom_weak::WeakDom){
self.unions.insert(asset_id,union);
}
}
impl Loader for MeshLoader{
type Error=MeshError;
type Index<'a>=MeshIndex<'a>;
type Resource=MeshWithSize;
fn load(&mut self,index:Self::Index<'_>)->Result<Self::Resource,Self::Error>{
let mesh=match index.mesh_type{
MeshType::FileMesh=>{
let id:RobloxAssetId=index.content.parse()?;
let mesh_with_size=self.meshes.get(&id).ok_or(MeshError::NoMesh)?;
mesh_with_size.clone()
},
MeshType::Union{mut physics_data,mut mesh_data,size_float_bits,part_texture_description}=>{
// decode asset
let size=size_float_bits.map(f32::from_bits).into();
if !index.content.is_empty()&&(physics_data.is_empty()||mesh_data.is_empty()){
let id:RobloxAssetId=index.content.parse()?;
let dom=self.unions.get(&id).ok_or(MeshError::NoMesh)?;
let &[referent]=dom.root().children()else{
return Err(MeshError::OneChildPolicy);
};
let Some(instance)=dom.get_by_ref(referent)else{
return Err(MeshError::MissingInstance);
};
if physics_data.is_empty(){
if let Some(rbx_dom_weak::types::Variant::BinaryString(data))=instance.properties.get(&static_ustr("PhysicsData")){
physics_data=data.as_ref();
}
}
if mesh_data.is_empty(){
if let Some(rbx_dom_weak::types::Variant::BinaryString(data))=instance.properties.get(&static_ustr("MeshData")){
mesh_data=data.as_ref();
}
}
strafesnet_rbx_loader::union::convert(physics_data,mesh_data,size,part_texture_description)?
}else{
strafesnet_rbx_loader::union::convert(physics_data,mesh_data,size,part_texture_description)?
}
},
};
Ok(mesh)
}
}
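The `MeshType::Union` branch above reduces to a simple selection rule: use the inline physics/mesh bytes when present, otherwise fall back to the bytes stored on the fetched instance. A minimal stand-alone sketch of that rule (function name hypothetical, plain byte slices standing in for rbx_dom_weak's `BinaryString` values):

```rust
/// Prefer inline data; fall back to the instance property only when inline is empty.
/// Mirrors the PhysicsData/MeshData fallback in MeshLoader::load above.
fn select_data<'a>(inline: &'a [u8], property: Option<&'a [u8]>) -> &'a [u8] {
    if !inline.is_empty() {
        // inline bytes shipped with the MeshIndex win
        inline
    } else {
        // otherwise take the downloaded instance's property, if any
        property.unwrap_or(&[])
    }
}
```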

combobulator/src/main.rs Normal file

@@ -0,0 +1,165 @@
use tokio_stream::StreamExt;
mod loader;
mod nats_types;
mod process;
mod s3;
const SUBJECT_MAPFIX_RELEASE:&str="maptest.mapfixes.release";
const SUBJECT_SUBMISSION_BATCHRELEASE:&str="maptest.submissions.batchrelease";
const SUBJECT_SEED:&str="maptest.combobulator.seed";
#[derive(Debug)]
pub enum StartupError{
NatsConnect(async_nats::ConnectError),
NatsGetStream(async_nats::jetstream::context::GetStreamError),
NatsConsumer(async_nats::jetstream::stream::ConsumerError),
NatsConsumerUpdate(async_nats::jetstream::stream::ConsumerUpdateError),
NatsStream(async_nats::jetstream::consumer::StreamError),
}
impl std::fmt::Display for StartupError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for StartupError{}
#[expect(dead_code)]
#[derive(Debug)]
enum HandleMessageError{
Json(serde_json::Error),
UnknownSubject(String),
Process(process::Error),
Ack(async_nats::Error),
Publish(async_nats::jetstream::context::PublishError),
}
impl std::fmt::Display for HandleMessageError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for HandleMessageError{}
fn from_slice<'a,T:serde::de::Deserialize<'a>>(slice:&'a [u8])->Result<T,HandleMessageError>{
serde_json::from_slice(slice).map_err(HandleMessageError::Json)
}
async fn handle_message(
processor:&process::Processor,
jetstream:&async_nats::jetstream::Context,
message:async_nats::jetstream::Message,
)->Result<(),HandleMessageError>{
match message.subject.as_str(){
SUBJECT_MAPFIX_RELEASE=>{
let request:nats_types::ReleaseMapfixRequest=from_slice(&message.payload)?;
processor.handle_mapfix_release(request).await.map_err(HandleMessageError::Process)?;
message.ack().await.map_err(HandleMessageError::Ack)?;
},
SUBJECT_SUBMISSION_BATCHRELEASE=>{
// split batch into individual seed messages
let batch:nats_types::ReleaseSubmissionsBatchRequest=from_slice(&message.payload)?;
println!("[combobulator] Splitting batch release (operation {}, {} submissions)",
batch.OperationID,batch.Submissions.len());
for submission in batch.Submissions{
let seed=nats_types::SeedCombobulatorRequest{AssetID:submission.UploadedAssetID};
let payload=serde_json::to_vec(&seed).map_err(HandleMessageError::Json)?;
jetstream.publish(SUBJECT_SEED,payload.into())
.await.map_err(HandleMessageError::Publish)?;
println!("[combobulator] Queued seed for asset {}",seed.AssetID);
}
message.ack().await.map_err(HandleMessageError::Ack)?;
},
SUBJECT_SEED=>{
let request:nats_types::SeedCombobulatorRequest=from_slice(&message.payload)?;
processor.handle_seed(request).await.map_err(HandleMessageError::Process)?;
message.ack().await.map_err(HandleMessageError::Ack)?;
},
other=>return Err(HandleMessageError::UnknownSubject(other.to_owned())),
}
println!("[combobulator] Message processed and acked");
Ok(())
}
#[tokio::main]
async fn main()->Result<(),StartupError>{
// roblox cookie api for downloading assets
let cookie=std::env::var("RBXCOOKIE").expect("RBXCOOKIE env required");
let cookie_context=rbx_asset::cookie::Context::new(rbx_asset::cookie::Cookie::new(cookie));
// s3
let s3_bucket=std::env::var("S3_BUCKET").expect("S3_BUCKET env required");
let s3_config=aws_config::load_defaults(aws_config::BehaviorVersion::latest()).await;
let s3_client=aws_sdk_s3::Client::new(&s3_config);
let s3_cache=s3::S3Cache::new(s3_client,s3_bucket);
let processor=process::Processor{
cookie_context,
s3:s3_cache,
};
// nats
let nats_host=std::env::var("NATS_HOST").expect("NATS_HOST env required");
const STREAM_NAME:&str="maptest";
const DURABLE_NAME:&str="combobulator";
let filter_subjects=vec![
SUBJECT_MAPFIX_RELEASE.to_owned(),
SUBJECT_SUBMISSION_BATCHRELEASE.to_owned(),
SUBJECT_SEED.to_owned(),
];
let nats_config=async_nats::jetstream::consumer::pull::Config{
name:Some(DURABLE_NAME.to_owned()),
durable_name:Some(DURABLE_NAME.to_owned()),
filter_subjects:filter_subjects.clone(),
ack_wait:std::time::Duration::from_secs(900), // 15 minutes for processing
max_deliver:5, // retry up to 5 times
..Default::default()
};
let nasty=async_nats::connect(nats_host).await.map_err(StartupError::NatsConnect)?;
let jetstream=async_nats::jetstream::new(nasty);
let stream=jetstream.get_stream(STREAM_NAME).await.map_err(StartupError::NatsGetStream)?;
let consumer=stream.get_or_create_consumer(DURABLE_NAME,nats_config.clone()).await.map_err(StartupError::NatsConsumer)?;
// update consumer config if filter subjects changed
if consumer.cached_info().config.filter_subjects!=filter_subjects{
stream.update_consumer(nats_config).await.map_err(StartupError::NatsConsumerUpdate)?;
}
let mut messages=consumer.messages().await.map_err(StartupError::NatsStream)?;
// SIGTERM graceful shutdown
let mut sig_term=tokio::signal::unix::signal(tokio::signal::unix::SignalKind::terminate())
.expect("Failed to create SIGTERM signal listener");
println!("[combobulator] Started, waiting for messages...");
// sequential processing loop - one message at a time
let main_loop=async{
while let Some(message_result)=messages.next().await{
match message_result{
Ok(message)=>{
match handle_message(&processor,&jetstream,message).await{
Ok(())=>println!("[combobulator] Success"),
Err(e)=>println!("[combobulator] Error: {e}"),
}
},
Err(e)=>println!("[combobulator] Message stream error: {e}"),
}
}
};
tokio::select!{
_=sig_term.recv()=>{
println!("[combobulator] Received SIGTERM, shutting down");
},
_=main_loop=>{
println!("[combobulator] Message stream ended");
},
};
Ok(())
}
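`handle_message` routes on the literal subject string and treats anything outside the three known subjects as an error. That routing decision can be sketched in isolation (constants copied from the file above; the `Route` enum and `classify` function are hypothetical illustrations, not part of the source):

```rust
const SUBJECT_MAPFIX_RELEASE: &str = "maptest.mapfixes.release";
const SUBJECT_SUBMISSION_BATCHRELEASE: &str = "maptest.submissions.batchrelease";
const SUBJECT_SEED: &str = "maptest.combobulator.seed";

#[derive(Debug, PartialEq)]
enum Route {
    MapfixRelease,
    SubmissionBatchRelease,
    Seed,
}

/// Classify a NATS subject the same way handle_message's match does,
/// returning Err with the unknown subject instead of HandleMessageError::UnknownSubject.
fn classify(subject: &str) -> Result<Route, String> {
    match subject {
        SUBJECT_MAPFIX_RELEASE => Ok(Route::MapfixRelease),
        SUBJECT_SUBMISSION_BATCHRELEASE => Ok(Route::SubmissionBatchRelease),
        SUBJECT_SEED => Ok(Route::Seed),
        other => Err(other.to_owned()),
    }
}
```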


@@ -0,0 +1,35 @@
#[expect(nonstandard_style,dead_code)]
#[derive(serde::Deserialize)]
pub struct ReleaseMapfixRequest{
pub MapfixID:u64,
pub ModelID:u64,
pub ModelVersion:u64,
pub TargetAssetID:u64,
}
#[expect(nonstandard_style,dead_code)]
#[derive(serde::Deserialize)]
pub struct ReleaseSubmissionRequest{
pub SubmissionID:u64,
pub ReleaseDate:i64,
pub ModelID:u64,
pub ModelVersion:u64,
pub UploadedAssetID:u64,
pub DisplayName:String,
pub Creator:String,
pub GameID:u32,
pub Submitter:u64,
}
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct ReleaseSubmissionsBatchRequest{
pub Submissions:Vec<ReleaseSubmissionRequest>,
pub OperationID:u32,
}
#[expect(nonstandard_style)]
#[derive(serde::Deserialize,serde::Serialize)]
pub struct SeedCombobulatorRequest{
pub AssetID:u64,
}

combobulator/src/process.rs Normal file

@@ -0,0 +1,280 @@
use std::io::Cursor;
use crate::nats_types::ReleaseMapfixRequest;
use crate::s3::S3Cache;
use futures_util::stream::iter as stream_iter;
use futures_util::{StreamExt,TryStreamExt};
use strafesnet_deferred_loader::deferred_loader::LoadFailureMode;
const CONCURRENT_REQUESTS:usize=16;
#[expect(dead_code)]
#[derive(Debug)]
pub enum ConvertError{
IO(std::io::Error),
SNFMap(strafesnet_snf::map::Error),
RobloxLoadMesh(super::loader::MeshError),
RobloxLoadTexture(super::loader::TextureError),
}
impl std::fmt::Display for ConvertError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for ConvertError{}
pub fn convert_to_snf(
dom:rbx_dom_weak::WeakDom,
mut mesh_loader:crate::loader::MeshLoader,
mut texture_loader:crate::loader::TextureLoader,
)->Result<Vec<u8>,ConvertError>{
const FAILURE_MODE:LoadFailureMode=LoadFailureMode::DefaultToNone;
// run scripts
let model=strafesnet_rbx_loader::Model::new(dom);
let mut place=strafesnet_rbx_loader::Place::from(model);
// TODO: script errors report for burn down chart
let _script_errors=place.run_scripts().unwrap_or_else(|e|vec![e]);
// convert
let mut texture_deferred_loader=strafesnet_deferred_loader::deferred_loader::RenderConfigDeferredLoader::new();
let mut mesh_deferred_loader=strafesnet_deferred_loader::deferred_loader::MeshDeferredLoader::new();
let map_step1=strafesnet_rbx_loader::rbx::convert(
place.as_ref(),
&mut texture_deferred_loader,
&mut mesh_deferred_loader,
);
let meshpart_meshes=mesh_deferred_loader.into_meshes(&mut mesh_loader,FAILURE_MODE).map_err(ConvertError::RobloxLoadMesh)?;
let map_step2=map_step1.add_meshpart_meshes_and_calculate_attributes(meshpart_meshes);
let render_configs=texture_deferred_loader.into_render_configs(&mut texture_loader,FAILURE_MODE).map_err(ConvertError::RobloxLoadTexture)?;
// TODO: conversion error report for burn down chart
let (map,_convert_errors)=map_step2.add_render_configs_and_textures(render_configs);
let mut snf_buf=Vec::new();
strafesnet_snf::map::write_map(Cursor::new(&mut snf_buf),map).map_err(ConvertError::SNFMap)?;
Ok(snf_buf)
}
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ArchivedModel,
LoadDom(map_tool::roblox::LoadDomError),
DownloadAsset(map_tool::roblox::DownloadAssetError),
ConvertSnf(ConvertError),
S3Get(crate::s3::GetError),
S3Put(crate::s3::PutError),
}
impl std::fmt::Display for Error{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for Error{}
pub struct Processor{
pub cookie_context:rbx_asset::cookie::Context,
pub s3:S3Cache,
}
impl Processor{
/// Download an asset, returning None if the asset is archived.
async fn download_asset(&self,asset_id:u64)->Result<Option<Vec<u8>>,Error>{
match map_tool::roblox::download_asset(&self.cookie_context,asset_id).await{
Ok(data)=>Ok(Some(data)),
Err(e)=>{
let s=format!("{e:?}");
if s.contains("Requested asset is archived"){
println!("[combobulator] Asset {asset_id} is archived, skipping");
Ok(None)
}else if s.contains("User is not authorized to access Asset"){
println!("[combobulator] User is not authorized to access Asset {asset_id}, skipping");
Ok(None)
}else if s.contains("Asset is not approved for the requester"){
println!("[combobulator] Asset {asset_id} is not approved for the requester, skipping");
Ok(None)
}else if s.contains("Request asset was not found"){
println!("[combobulator] Asset {asset_id} was not found, skipping");
Ok(None)
}else{
Err(Error::DownloadAsset(e))
}
}
}
}
/// Process a single model: extract assets, cache to S3, build SNF.
async fn process_model(&self,asset_id:u64)->Result<(),Error>{
println!("[combobulator] Downloading model {asset_id}");
let rbxl_bytes=self.download_asset(asset_id).await?
.ok_or(Error::ArchivedModel)?;
// decode dom
let dom=map_tool::roblox::load_dom(&rbxl_bytes)
.map_err(Error::LoadDom)?;
// extract unique assets from the file
let assets=map_tool::roblox::get_unique_assets(&dom);
// place textures into 'loader'
let texture_loader=crate::loader::TextureLoader::new();
// process textures: download, cache, convert to DDS
let texture_loader=stream_iter(assets.textures).map(async|id|{
let asset_id=id.0;
let dds_key=S3Cache::texture_dds_key(asset_id);
// fetch cached DDS
let dds=if let Some(dds)=self.s3.get(&dds_key).await.map_err(Error::S3Get)?{
dds
}else{
// check raw cache, download if missing
let raw_key=S3Cache::texture_raw_key(asset_id);
let dds_result=if let Some(data)=self.s3.get(&raw_key).await.map_err(Error::S3Get)?{
map_tool::roblox::convert_texture_to_dds(&data)
}else{
println!("[combobulator] Downloading texture {asset_id}");
let Some(data)=self.download_asset(asset_id).await? else{
return Ok(None);
};
// decode while we have ownership
let dds_result=map_tool::roblox::convert_texture_to_dds(&data);
self.s3.put(&raw_key,data).await.map_err(Error::S3Put)?;
dds_result
};
// handle error after caching data
let dds=match dds_result{
Ok(dds)=>dds,
Err(e)=>{
println!("[combobulator] Texture {asset_id} convert error: {e}");
return Ok(None);
}
};
self.s3.put(&dds_key,dds.clone()).await.map_err(Error::S3Put)?;
dds
};
println!("[combobulator] Texture {asset_id} processed");
Ok(Some((id,dds)))
})
.buffer_unordered(CONCURRENT_REQUESTS)
.try_fold(texture_loader,async|mut texture_loader,maybe_loaded_texture|{
if let Some((id,dds))=maybe_loaded_texture{
texture_loader.insert(id,dds);
}
Ok(texture_loader)
}).await?;
let mesh_loader=crate::loader::MeshLoader::new();
// process meshes
let mesh_loader=stream_iter(assets.meshes).map(async|id|{
let asset_id=id.0;
let mesh_key=S3Cache::mesh_key(asset_id);
let mesh_result=if let Some(data)=self.s3.get(&mesh_key).await.map_err(Error::S3Get)?{
strafesnet_rbx_loader::mesh::convert(&data)
}else{
println!("[combobulator] Downloading mesh {asset_id}");
let Some(data)=self.download_asset(asset_id).await? else{
return Ok(None);
};
// decode while we have ownership
let mesh_result=strafesnet_rbx_loader::mesh::convert(&data);
self.s3.put(&mesh_key,data.clone()).await.map_err(Error::S3Put)?;
mesh_result
};
println!("[combobulator] Mesh {asset_id} processed");
// handle error after caching data
match mesh_result{
Ok(mesh)=>Ok(Some((id,mesh))),
Err(e)=>{
println!("[combobulator] Mesh {asset_id} convert error: {e}");
Ok(None)
},
}
})
.buffer_unordered(CONCURRENT_REQUESTS)
.try_fold(mesh_loader,async|mut mesh_loader,maybe_loaded_mesh|{
if let Some((id,mesh))=maybe_loaded_mesh{
mesh_loader.insert_mesh(id,mesh);
}
Ok(mesh_loader)
}).await?;
// process unions
let mesh_loader=stream_iter(assets.unions).map(async|id|{
let asset_id=id.0;
let union_key=S3Cache::union_key(asset_id);
let union_result=if let Some(data)=self.s3.get(&union_key).await.map_err(Error::S3Get)?{
rbx_binary::from_reader(data.as_slice())
}else{
println!("[combobulator] Downloading union {asset_id}");
let Some(data)=self.download_asset(asset_id).await? else{
return Ok(None);
};
// decode the data while we have ownership
let union_result=rbx_binary::from_reader(data.as_slice());
self.s3.put(&union_key,data).await.map_err(Error::S3Put)?;
union_result
};
println!("[combobulator] Union {asset_id} processed");
// handle error after caching data
match union_result{
Ok(union)=>Ok(Some((id,union))),
Err(e)=>{
println!("[combobulator] Union {asset_id} convert error: {e}");
Ok(None)
},
}
})
.buffer_unordered(CONCURRENT_REQUESTS)
.try_fold(mesh_loader,async|mut mesh_loader,maybe_loaded_union|{
if let Some((id,union))=maybe_loaded_union{
mesh_loader.insert_union(id,union);
}
Ok(mesh_loader)
}).await?;
// convert to SNF and upload
println!("[combobulator] Converting to SNF");
let snf=convert_to_snf(dom,mesh_loader,texture_loader)
.map_err(Error::ConvertSnf)?;
let snf_key=S3Cache::snf_key(asset_id);
self.s3.put(&snf_key,snf).await.map_err(Error::S3Put)?;
println!("[combobulator] SNF uploaded to {snf_key}");
Ok(())
}
/// Handle a mapfix release message.
pub async fn handle_mapfix_release(&self,request:ReleaseMapfixRequest)->Result<(),Error>{
println!("[combobulator] Processing mapfix {} (asset {})",
request.MapfixID,request.TargetAssetID);
self.process_model(request.TargetAssetID).await
}
/// Handle a seed request (reprocess an existing map).
pub async fn handle_seed(&self,request:crate::nats_types::SeedCombobulatorRequest)->Result<(),Error>{
println!("[combobulator] Seeding asset {}",request.AssetID);
self.process_model(request.AssetID).await
}
}
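Each asset type in `process_model` follows the same shape: try the S3 cache, otherwise download and cache the raw bytes first, and only then inspect the conversion result, so a bad asset is skipped without losing the downloaded data. A synchronous sketch of that pattern with a `HashMap` standing in for S3 (function name and closures are hypothetical):

```rust
use std::collections::HashMap;

/// Cache-first fetch mirroring process_model: cached bytes are converted directly;
/// on a miss the raw download is cached *before* the conversion result is inspected,
/// so a convert failure (returned as None here) never loses the bytes.
fn fetch_convert<T>(
    cache: &mut HashMap<String, Vec<u8>>,
    key: &str,
    download: impl FnOnce() -> Vec<u8>,
    convert: impl Fn(&[u8]) -> Result<T, String>,
) -> Option<T> {
    let result = match cache.get(key) {
        Some(data) => convert(data),
        None => {
            let data = download();
            // decode while we have ownership
            let result = convert(&data);
            // cache even if conversion failed
            cache.insert(key.to_owned(), data);
            result
        }
    };
    // handle the error only after the data is cached
    result.ok()
}
```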

combobulator/src/s3.rs Normal file

@@ -0,0 +1,96 @@
use aws_sdk_s3::Client;
use aws_sdk_s3::primitives::ByteStream;
#[expect(dead_code)]
#[derive(Debug)]
pub enum GetError{
Get(aws_sdk_s3::error::SdkError<aws_sdk_s3::operation::get_object::GetObjectError>),
Collect(aws_sdk_s3::primitives::ByteStreamError),
}
impl std::fmt::Display for GetError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for GetError{}
#[expect(dead_code)]
#[derive(Debug)]
pub enum PutError{
Put(aws_sdk_s3::error::SdkError<aws_sdk_s3::operation::put_object::PutObjectError>),
}
impl std::fmt::Display for PutError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"{self:?}")
}
}
impl std::error::Error for PutError{}
pub struct S3Cache{
client:Client,
bucket:String,
}
impl S3Cache{
pub fn new(client:Client,bucket:String)->Self{
Self{client,bucket}
}
/// Try to get a cached object. Returns None if the key doesn't exist.
pub async fn get(&self,key:&str)->Result<Option<Vec<u8>>,GetError>{
match self.client.get_object()
.bucket(&self.bucket)
.key(key)
.send()
.await
{
Ok(output)=>{
let bytes=output.body.collect().await.map_err(GetError::Collect)?;
Ok(Some(bytes.to_vec()))
},
Err(e)=>{
// check if it's a NoSuchKey error
if let aws_sdk_s3::error::SdkError::ServiceError(ref service_err)=e{
if service_err.err().is_no_such_key(){
return Ok(None);
}
}
Err(GetError::Get(e))
},
}
}
/// Put an object into S3.
pub async fn put(&self,key:&str,data:Vec<u8>)->Result<(),PutError>{
self.client.put_object()
.bucket(&self.bucket)
.key(key)
.body(ByteStream::from(data))
.send()
.await
.map_err(PutError::Put)?;
Ok(())
}
// S3 key helpers
pub fn texture_raw_key(asset_id:u64)->String{
format!("assets/textures/{asset_id}.raw")
}
pub fn texture_dds_key(asset_id:u64)->String{
format!("assets/textures/{asset_id}.dds")
}
pub fn mesh_key(asset_id:u64)->String{
format!("assets/meshes/{asset_id}")
}
pub fn union_key(asset_id:u64)->String{
format!("assets/unions/{asset_id}")
}
pub fn snf_key(model_id:u64)->String{
format!("maps/{model_id}.snfm")
}
}
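The key helpers give every cached artifact a deterministic S3 path, which is what lets a re-run of the same asset hit the cache instead of re-downloading. Extracted as free functions for illustration (same format strings as the methods above):

```rust
// Free-function copies of the S3Cache key helpers above.
fn texture_raw_key(asset_id: u64) -> String {
    format!("assets/textures/{asset_id}.raw")
}
fn texture_dds_key(asset_id: u64) -> String {
    format!("assets/textures/{asset_id}.dds")
}
fn snf_key(model_id: u64) -> String {
    format!("maps/{model_id}.snfm")
}
```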


@@ -34,7 +34,7 @@ services:
"--data-rpc-host","dataservice:9000",
]
env_file:
- ~/auth-compose/strafesnet_staging.env
- /home/quat/auth-compose/strafesnet_staging.env
depends_on:
- authrpc
- nats
@@ -59,7 +59,7 @@ services:
maptest-validator
container_name: validation
env_file:
- ~/auth-compose/strafesnet_staging.env
- /home/quat/auth-compose/strafesnet_staging.env
environment:
- ROBLOX_GROUP_ID=17032139 # "None" is a special-case string value
- API_HOST_INTERNAL=http://submissions:8083/v1
@@ -105,7 +105,7 @@ services:
- REDIS_ADDR=authredis:6379
- RBX_GROUP_ID=17032139
env_file:
- ~/auth-compose/auth-service.env
- /home/quat/auth-compose/auth-service.env
depends_on:
- authredis
networks:
@@ -119,7 +119,7 @@ services:
environment:
- REDIS_ADDR=authredis:6379
env_file:
- ~/auth-compose/auth-service.env
- /home/quat/auth-compose/auth-service.env
depends_on:
- authredis
networks:


@@ -115,6 +115,46 @@ const docTemplate = `{
}
}
}
},
"/map/{id}/snfm": {
"get": {
"security": [
{
"ApiKeyAuth": []
}
],
"description": "Redirects to a signed download URL for a map's SNFM file",
"tags": [
"maps"
],
"summary": "Download SNFM file",
"parameters": [
{
"type": "integer",
"description": "Map ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"307": {
"description": "Redirect to signed S3 URL"
},
"404": {
"description": "Map not found",
"schema": {
"$ref": "#/definitions/Error"
}
},
"default": {
"description": "General error response",
"schema": {
"$ref": "#/definitions/Error"
}
}
}
}
}
},
"definitions": {


@@ -108,6 +108,46 @@
}
}
}
},
"/map/{id}/snfm": {
"get": {
"security": [
{
"ApiKeyAuth": []
}
],
"description": "Redirects to a signed download URL for a map's SNFM file",
"tags": [
"maps"
],
"summary": "Download SNFM file",
"parameters": [
{
"type": "integer",
"description": "Map ID",
"name": "id",
"in": "path",
"required": true
}
],
"responses": {
"307": {
"description": "Redirect to signed S3 URL"
},
"404": {
"description": "Map not found",
"schema": {
"$ref": "#/definitions/Error"
}
},
"default": {
"description": "General error response",
"schema": {
"$ref": "#/definitions/Error"
}
}
}
}
}
},
"definitions": {


@@ -133,6 +133,31 @@ paths:
summary: Get map by ID
tags:
- maps
/map/{id}/snfm:
get:
description: Redirects to a signed download URL for a map's SNFM file
parameters:
- description: Map ID
in: path
name: id
required: true
type: integer
responses:
"307":
description: Redirect to signed S3 URL
"404":
description: Map not found
schema:
$ref: '#/definitions/Error'
default:
description: General error response
schema:
$ref: '#/definitions/Error'
security:
- ApiKeyAuth: []
summary: Download SNFM file
tags:
- maps
securityDefinitions:
ApiKeyAuth:
in: header


@@ -1,4 +1,4 @@
package main
//go:generate swag init -g ./cmd/maps-service/service.go
//go:generate go run github.com/swaggo/swag/cmd/swag@latest init -g ./cmd/maps-service/service.go
//go:generate go run github.com/ogen-go/ogen/cmd/ogen@latest --target pkg/api --clean openapi.yaml

go.mod

@@ -6,22 +6,23 @@ toolchain go1.24.5
require (
git.itzana.me/StrafesNET/dev-service v0.0.0-20250628052121-92af8193b5ed
git.itzana.me/strafesnet/go-grpc v0.0.0-20250815013325-1c84f73bdcb1
git.itzana.me/strafesnet/go-grpc v0.0.0-20260301211036-f2db3cb46e8c
git.itzana.me/strafesnet/utils v0.0.0-20220716194944-d8ca164052f9
github.com/dchest/siphash v1.2.3
github.com/gin-gonic/gin v1.10.1
github.com/go-faster/errors v0.7.1
github.com/go-faster/jx v1.1.0
github.com/go-faster/jx v1.2.0
github.com/nats-io/nats.go v1.37.0
github.com/ogen-go/ogen v1.2.1
github.com/ogen-go/ogen v1.18.0
github.com/redis/go-redis/v9 v9.10.0
github.com/sirupsen/logrus v1.9.3
github.com/swaggo/files v1.0.1
github.com/swaggo/gin-swagger v1.6.0
github.com/swaggo/swag v1.16.6
github.com/urfave/cli/v2 v2.27.6
go.opentelemetry.io/otel v1.32.0
go.opentelemetry.io/otel/metric v1.32.0
go.opentelemetry.io/otel/trace v1.32.0
go.opentelemetry.io/otel v1.40.0
go.opentelemetry.io/otel/metric v1.40.0
go.opentelemetry.io/otel/trace v1.40.0
google.golang.org/grpc v1.48.0
gorm.io/driver/postgres v1.6.0
gorm.io/gorm v1.25.12
@@ -31,11 +32,32 @@ require (
github.com/KyleBanks/depth v1.2.1 // indirect
github.com/PuerkitoBio/purell v1.1.1 // indirect
github.com/PuerkitoBio/urlesc v0.0.0-20170810143723-de5bf2ad4578 // indirect
github.com/aws/aws-sdk-go-v2 v1.41.2 // indirect
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.7.5 // indirect
github.com/aws/aws-sdk-go-v2/config v1.32.10 // indirect
github.com/aws/aws-sdk-go-v2/credentials v1.19.10 // indirect
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.18 // indirect
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.18 // indirect
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.18 // indirect
github.com/aws/aws-sdk-go-v2/internal/ini v1.8.4 // indirect
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.18 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.5 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.10 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.18 // indirect
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.18 // indirect
github.com/aws/aws-sdk-go-v2/service/s3 v1.96.2 // indirect
github.com/aws/aws-sdk-go-v2/service/signin v1.0.6 // indirect
github.com/aws/aws-sdk-go-v2/service/sso v1.30.11 // indirect
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.15 // indirect
github.com/aws/aws-sdk-go-v2/service/sts v1.41.7 // indirect
github.com/aws/smithy-go v1.24.1 // indirect
github.com/bytedance/sonic v1.11.6 // indirect
github.com/bytedance/sonic/loader v0.1.1 // indirect
github.com/cespare/xxhash/v2 v2.3.0 // indirect
github.com/cloudwego/base64x v0.1.4 // indirect
github.com/cloudwego/iasm v0.2.0 // indirect
github.com/cpuguy83/go-md2man/v2 v2.0.5 // indirect
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f // indirect
github.com/gabriel-vasile/mimetype v1.4.3 // indirect
github.com/gin-contrib/sse v0.1.0 // indirect
github.com/go-openapi/jsonpointer v0.19.5 // indirect
@@ -55,7 +77,7 @@ require (
github.com/jinzhu/now v1.1.5 // indirect
github.com/josharian/intern v1.0.0 // indirect
github.com/json-iterator/go v1.1.12 // indirect
github.com/klauspost/compress v1.17.6 // indirect
github.com/klauspost/compress v1.18.1 // indirect
github.com/klauspost/cpuid/v2 v2.2.7 // indirect
github.com/leodido/go-urn v1.4.0 // indirect
github.com/mailru/easyjson v0.7.6 // indirect
@@ -65,36 +87,38 @@ require (
github.com/nats-io/nuid v1.0.1 // indirect
github.com/pelletier/go-toml/v2 v2.2.2 // indirect
github.com/russross/blackfriday/v2 v2.1.0 // indirect
github.com/shopspring/decimal v1.4.0 // indirect
github.com/twitchyliquid64/golang-asm v0.15.1 // indirect
github.com/ugorji/go/codec v1.2.12 // indirect
github.com/xrash/smetrics v0.0.0-20240521201337-686a1a2994c1 // indirect
go.opentelemetry.io/auto/sdk v1.2.1 // indirect
golang.org/x/arch v0.8.0 // indirect
golang.org/x/crypto v0.32.0 // indirect
golang.org/x/mod v0.17.0 // indirect
golang.org/x/tools v0.21.1-0.20240508182429-e35e4ccd0d2d // indirect
golang.org/x/crypto v0.46.0 // indirect
golang.org/x/mod v0.31.0 // indirect
golang.org/x/tools v0.40.0 // indirect
google.golang.org/genproto v0.0.0-20200526211855-cb27e3aa2013 // indirect
google.golang.org/protobuf v1.34.1 // indirect
gopkg.in/yaml.v3 v3.0.1 // indirect
)
require (
github.com/dlclark/regexp2 v1.11.0 // indirect
github.com/fatih/color v1.17.0 // indirect
github.com/dlclark/regexp2 v1.11.5 // indirect
github.com/fatih/color v1.18.0 // indirect
github.com/ghodss/yaml v1.0.0 // indirect
github.com/go-faster/yaml v0.4.6 // indirect
github.com/go-logr/logr v1.4.2 // indirect
github.com/go-logr/logr v1.4.3 // indirect
github.com/go-logr/stdr v1.2.2 // indirect
// github.com/golang/protobuf v1.5.4 // indirect
github.com/google/uuid v1.6.0 // indirect
github.com/mattn/go-colorable v0.1.13 // indirect
github.com/mattn/go-colorable v0.1.14 // indirect
github.com/mattn/go-isatty v0.0.20 // indirect
github.com/segmentio/asm v1.2.0 // indirect
github.com/segmentio/asm v1.2.1 // indirect
go.uber.org/multierr v1.11.0 // indirect
go.uber.org/zap v1.27.0 // indirect
golang.org/x/exp v0.0.0-20240531132922-fd00a4e0eefc // indirect
golang.org/x/net v0.34.0 // indirect
golang.org/x/sync v0.12.0 // indirect
golang.org/x/sys v0.29.0 // indirect
golang.org/x/text v0.23.0 // indirect
go.uber.org/zap v1.27.1 // indirect
golang.org/x/exp v0.0.0-20251219203646-944ab1f22d93 // indirect
golang.org/x/net v0.48.0 // indirect
golang.org/x/sync v0.19.0 // indirect
golang.org/x/sys v0.39.0 // indirect
golang.org/x/text v0.32.0 // indirect
gopkg.in/yaml.v2 v2.4.0 // indirect
)

go.sum

@@ -2,8 +2,12 @@ cloud.google.com/go v0.26.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMT
cloud.google.com/go v0.34.0/go.mod h1:aQUYkXzVsufM+DwF1aE+0xfcU+56JwCaLick0ClmMTw=
git.itzana.me/StrafesNET/dev-service v0.0.0-20250628052121-92af8193b5ed h1:eGWIQx2AOrSsLC2dieuSs8MCliRE60tvpZnmxsTBtKc=
git.itzana.me/StrafesNET/dev-service v0.0.0-20250628052121-92af8193b5ed/go.mod h1:KJal0K++M6HEzSry6JJ2iDPZtOQn5zSstNlDbU3X4Jg=
git.itzana.me/strafesnet/go-grpc v0.0.0-20250815013325-1c84f73bdcb1 h1:imXibfeYcae6og0TTDUFRQ3CQtstGjIoLbCn+pezD2o=
git.itzana.me/strafesnet/go-grpc v0.0.0-20250815013325-1c84f73bdcb1/go.mod h1:X7XTRUScRkBWq8q8bplbeso105RPDlnY7J6Wy1IwBMs=
git.itzana.me/strafesnet/go-grpc v0.0.0-20251228204118-c20dbb42afec h1:JSar9If1kzb02+Erp+zmSqHKWPPP2NqMQVK15pRmkLE=
git.itzana.me/strafesnet/go-grpc v0.0.0-20251228204118-c20dbb42afec/go.mod h1:X7XTRUScRkBWq8q8bplbeso105RPDlnY7J6Wy1IwBMs=
git.itzana.me/strafesnet/go-grpc v0.0.0-20260301210537-0bea64387f6d h1:I73hWqmIcsSH90VHjwsg50v6emQkM0IAA04vb4wktBA=
git.itzana.me/strafesnet/go-grpc v0.0.0-20260301210537-0bea64387f6d/go.mod h1:X7XTRUScRkBWq8q8bplbeso105RPDlnY7J6Wy1IwBMs=
git.itzana.me/strafesnet/go-grpc v0.0.0-20260301211036-f2db3cb46e8c h1:sI50ymozoI+HFbxg1AOdCeWF6bJgpeP6OrnCvyjuQ9U=
git.itzana.me/strafesnet/go-grpc v0.0.0-20260301211036-f2db3cb46e8c/go.mod h1:X7XTRUScRkBWq8q8bplbeso105RPDlnY7J6Wy1IwBMs=
git.itzana.me/strafesnet/utils v0.0.0-20220716194944-d8ca164052f9 h1:7lU6jyR7S7Rhh1dnUp7GyIRHUTBXZagw8F4n4hOyxLw=
git.itzana.me/strafesnet/utils v0.0.0-20220716194944-d8ca164052f9/go.mod h1:uyYerSieEt4v0MJCdPLppG0LtJ4Yj035vuTetWGsxjY=
github.com/BurntSushi/toml v0.3.1/go.mod h1:xHWCNGjB5oqiDr8zfno3MHue2Ht5sIBksp03qcyfWMU=
@@ -14,12 +18,56 @@ github.com/PuerkitoBio/purell v1.1.1/go.mod h1:c11w/QuzBsJSee3cPx9rAFu61PvFxuPbt
github.com/PuerkitoBio/urlesc v0.0.0-20170810143723-de5bf2ad4578 h1:d+Bc7a5rLufV/sSk/8dngufqelfh6jnri85riMAaF/M=
github.com/PuerkitoBio/urlesc v0.0.0-20170810143723-de5bf2ad4578/go.mod h1:uGdkoq3SwY9Y+13GIhn11/XLaGBb4BfwItxLd5jeuXE=
github.com/antihax/optional v1.0.0/go.mod h1:uupD/76wgC+ih3iEmQUL+0Ugr19nfwCT1kdvxnR2qWY=
github.com/aws/aws-sdk-go-v2 v1.41.2 h1:LuT2rzqNQsauaGkPK/7813XxcZ3o3yePY0Iy891T2ls=
github.com/aws/aws-sdk-go-v2 v1.41.2/go.mod h1:IvvlAZQXvTXznUPfRVfryiG1fbzE2NGK6m9u39YQ+S4=
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.7.5 h1:zWFmPmgw4sveAYi1mRqG+E/g0461cJ5M4bJ8/nc6d3Q=
github.com/aws/aws-sdk-go-v2/aws/protocol/eventstream v1.7.5/go.mod h1:nVUlMLVV8ycXSb7mSkcNu9e3v/1TJq2RTlrPwhYWr5c=
github.com/aws/aws-sdk-go-v2/config v1.32.10 h1:9DMthfO6XWZYLfzZglAgW5Fyou2nRI5CuV44sTedKBI=
github.com/aws/aws-sdk-go-v2/config v1.32.10/go.mod h1:2rUIOnA2JaiqYmSKYmRJlcMWy6qTj1vuRFscppSBMcw=
github.com/aws/aws-sdk-go-v2/credentials v1.19.10 h1:EEhmEUFCE1Yhl7vDhNOI5OCL/iKMdkkYFTRpZXNw7m8=
github.com/aws/aws-sdk-go-v2/credentials v1.19.10/go.mod h1:RnnlFCAlxQCkN2Q379B67USkBMu1PipEEiibzYN5UTE=
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.18 h1:Ii4s+Sq3yDfaMLpjrJsqD6SmG/Wq/P5L/hw2qa78UAY=
github.com/aws/aws-sdk-go-v2/feature/ec2/imds v1.18.18/go.mod h1:6x81qnY++ovptLE6nWQeWrpXxbnlIex+4H4eYYGcqfc=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.18 h1:F43zk1vemYIqPAwhjTjYIz0irU2EY7sOb/F5eJ3HuyM=
github.com/aws/aws-sdk-go-v2/internal/configsources v1.4.18/go.mod h1:w1jdlZXrGKaJcNoL+Nnrj+k5wlpGXqnNrKoP22HvAug=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.18 h1:xCeWVjj0ki0l3nruoyP2slHsGArMxeiiaoPN5QZH6YQ=
github.com/aws/aws-sdk-go-v2/internal/endpoints/v2 v2.7.18/go.mod h1:r/eLGuGCBw6l36ZRWiw6PaZwPXb6YOj+i/7MizNl5/k=
github.com/aws/aws-sdk-go-v2/internal/ini v1.8.4 h1:WKuaxf++XKWlHWu9ECbMlha8WOEGm0OUEZqm4K/Gcfk=
github.com/aws/aws-sdk-go-v2/internal/ini v1.8.4/go.mod h1:ZWy7j6v1vWGmPReu0iSGvRiise4YI5SkR3OHKTZ6Wuc=
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.18 h1:eZioDaZGJ0tMM4gzmkNIO2aAoQd+je7Ug7TkvAzlmkU=
github.com/aws/aws-sdk-go-v2/internal/v4a v1.4.18/go.mod h1:CCXwUKAJdoWr6/NcxZ+zsiPr6oH/Q5aTooRGYieAyj4=
github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.5 h1:CeY9LUdur+Dxoeldqoun6y4WtJ3RQtzk0JMP2gfUay0=
github.com/aws/aws-sdk-go-v2/service/internal/accept-encoding v1.13.5/go.mod h1:AZLZf2fMaahW5s/wMRciu1sYbdsikT/UHwbUjOdEVTc=
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.10 h1:fJvQ5mIBVfKtiyx0AHY6HeWcRX5LGANLpq8SVR+Uazs=
github.com/aws/aws-sdk-go-v2/service/internal/checksum v1.9.10/go.mod h1:Kzm5e6OmNH8VMkgK9t+ry5jEih4Y8whqs+1hrkxim1I=
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.18 h1:LTRCYFlnnKFlKsyIQxKhJuDuA3ZkrDQMRYm6rXiHlLY=
github.com/aws/aws-sdk-go-v2/service/internal/presigned-url v1.13.18/go.mod h1:XhwkgGG6bHSd00nO/mexWTcTjgd6PjuvWQMqSn2UaEk=
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.18 h1:/A/xDuZAVD2BpsS2fftFRo/NoEKQJ8YTnJDEHBy2Gtg=
github.com/aws/aws-sdk-go-v2/service/internal/s3shared v1.19.18/go.mod h1:hWe9b4f+djUQGmyiGEeOnZv69dtMSgpDRIvNMvuvzvY=
github.com/aws/aws-sdk-go-v2/service/s3 v1.96.2 h1:M1A9AjcFwlxTLuf0Faj88L8Iqw0n/AJHjpZTQzMMsSc=
github.com/aws/aws-sdk-go-v2/service/s3 v1.96.2/go.mod h1:KsdTV6Q9WKUZm2mNJnUFmIoXfZux91M3sr/a4REX8e0=
github.com/aws/aws-sdk-go-v2/service/signin v1.0.6 h1:MzORe+J94I+hYu2a6XmV5yC9huoTv8NRcCrUNedDypQ=
github.com/aws/aws-sdk-go-v2/service/signin v1.0.6/go.mod h1:hXzcHLARD7GeWnifd8j9RWqtfIgxj4/cAtIVIK7hg8g=
github.com/aws/aws-sdk-go-v2/service/sso v1.30.11 h1:7oGD8KPfBOJGXiCoRKrrrQkbvCp8N++u36hrLMPey6o=
github.com/aws/aws-sdk-go-v2/service/sso v1.30.11/go.mod h1:0DO9B5EUJQlIDif+XJRWCljZRKsAFKh3gpFz7UnDtOo=
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.15 h1:edCcNp9eGIUDUCrzoCu1jWAXLGFIizeqkdkKgRlJwWc=
github.com/aws/aws-sdk-go-v2/service/ssooidc v1.35.15/go.mod h1:lyRQKED9xWfgkYC/wmmYfv7iVIM68Z5OQ88ZdcV1QbU=
github.com/aws/aws-sdk-go-v2/service/sts v1.41.7 h1:NITQpgo9A5NrDZ57uOWj+abvXSb83BbyggcUBVksN7c=
github.com/aws/aws-sdk-go-v2/service/sts v1.41.7/go.mod h1:sks5UWBhEuWYDPdwlnRFn1w7xWdH29Jcpe+/PJQefEs=
github.com/aws/smithy-go v1.24.1 h1:VbyeNfmYkWoxMVpGUAbQumkODcYmfMRfZ8yQiH30SK0=
github.com/aws/smithy-go v1.24.1/go.mod h1:LEj2LM3rBRQJxPZTB4KuzZkaZYnZPnvgIhb4pu07mx0=
github.com/bsm/ginkgo/v2 v2.12.0 h1:Ny8MWAHyOepLGlLKYmXG4IEkioBysk6GpaRTLC8zwWs=
github.com/bsm/ginkgo/v2 v2.12.0/go.mod h1:SwYbGRRDovPVboqFv0tPTcG1sN61LM1Z4ARdbAV9g4c=
github.com/bsm/gomega v1.27.10 h1:yeMWxP2pV2fG3FgAODIY8EiRE3dy0aeFYt4l7wh6yKA=
github.com/bsm/gomega v1.27.10/go.mod h1:JyEr/xRbxbtgWNi8tIEVPUYZ5Dzef52k01W3YH0H+O0=
github.com/bytedance/sonic v1.11.6 h1:oUp34TzMlL+OY1OUWxHqsdkgC/Zfc85zGqw9siXjrc0=
github.com/bytedance/sonic v1.11.6/go.mod h1:LysEHSvpvDySVdC2f87zGWf6CIKJcAvqab1ZaiQtds4=
github.com/bytedance/sonic/loader v0.1.1 h1:c+e5Pt1k/cy5wMveRDyk2X4B9hF4g7an8N3zCYjJFNM=
github.com/bytedance/sonic/loader v0.1.1/go.mod h1:ncP89zfokxS5LZrJxl5z0UJcsk4M4yY2JpfqGeCtNLU=
github.com/census-instrumentation/opencensus-proto v0.2.1/go.mod h1:f6KPmirojxKA12rnyqOA5BBL4O983OfeGPqjHWSTneU=
github.com/cespare/xxhash/v2 v2.1.1/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/cespare/xxhash/v2 v2.3.0 h1:UL815xU9SqsFlibzuggzjXhog7bL6oX9BbNZnL2UFvs=
github.com/cespare/xxhash/v2 v2.3.0/go.mod h1:VGX0DQ3Q6kWi7AoAeZDth3/j3BFtOZR5XLFGgcrjCOs=
github.com/client9/misspell v0.3.4/go.mod h1:qj6jICC3Q7zFZvVWo7KLAzC3yx5G7kyvSDkc90ppPyw=
github.com/cloudwego/base64x v0.1.4 h1:jwCgWpFanWmN8xoIUHa2rtzmkd5J2plF/dnLS6Xd/0Y=
github.com/cloudwego/base64x v0.1.4/go.mod h1:0zlkT4Wn5C6NdauXdJRhSKRlJvmclQ1hhJgA0rcu/8w=
@@ -39,16 +87,18 @@ github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dchest/siphash v1.2.3 h1:QXwFc8cFOR2dSa/gE6o/HokBMWtLUaNDVd+22aKHeEA=
github.com/dchest/siphash v1.2.3/go.mod h1:0NvQU092bT0ipiFN++/rXm69QG9tVxLAlQHIXMPAkHc=
github.com/dlclark/regexp2 v1.11.0 h1:G/nrcoOa7ZXlpoa/91N3X7mM3r8eIlMBBJZvsz/mxKI=
github.com/dlclark/regexp2 v1.11.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f h1:lO4WD4F/rVNCu3HqELle0jiPLLBs70cWOduZpkS1E78=
github.com/dgryski/go-rendezvous v0.0.0-20200823014737-9f7001d12a5f/go.mod h1:cuUVRXasLTGF7a8hSLbxyZXjz+1KgoB3wDUb6vlszIc=
github.com/dlclark/regexp2 v1.11.5 h1:Q/sSnsKerHeCkc/jSTNq1oCm7KiVgUMZRDUoRu0JQZQ=
github.com/dlclark/regexp2 v1.11.5/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/envoyproxy/go-control-plane v0.9.0/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.1-0.20191026205805-5f8ba28d4473/go.mod h1:YTl/9mNaCwkRvm6d1a2C3ymFceY/DCBVvsKhRF0iEA4=
github.com/envoyproxy/go-control-plane v0.9.4/go.mod h1:6rpuAdCZL397s3pYoYcLgu1mIlRU8Am5FuJP05cCM98=
github.com/envoyproxy/go-control-plane v0.9.9-0.20201210154907-fd9021fe5dad/go.mod h1:cXg6YxExXjJnVBQHBLXeUAgxn2UodCpnH306RInaBQk=
github.com/envoyproxy/go-control-plane v0.10.2-0.20220325020618-49ff273808a1/go.mod h1:KJwIaB5Mv44NWtYuAOFCVOjcI94vtpEz2JU/D2v6IjE=
github.com/envoyproxy/protoc-gen-validate v0.1.0/go.mod h1:iSmxcyjqTsJpI2R4NaDN7+kN2VEUnK/pcBlmesArF7c=
github.com/fatih/color v1.17.0 h1:GlRw1BRJxkpqUCBKzKOw098ed57fEsKeNjpTe3cSjK4=
github.com/fatih/color v1.17.0/go.mod h1:YZ7TlrGPkiz6ku9fK3TLD/pl3CpsiFyu8N92HLgmosI=
github.com/fatih/color v1.18.0 h1:S8gINlzdQ840/4pfAwic/ZE0djQEH3wM94VfqLTZcOM=
github.com/fatih/color v1.18.0/go.mod h1:4FelSpRwEGDpQ12mAdzqdOukCy4u8WUtOY6lkT/6HfU=
github.com/gabriel-vasile/mimetype v1.4.3 h1:in2uUcidCuFcDKtdcBxlR0rJ1+fsokWf+uqxgUFjbI0=
github.com/gabriel-vasile/mimetype v1.4.3/go.mod h1:d8uq/6HKRL6CGdk+aubisF/M5GcPfT7nKyLpA0lbSSk=
github.com/ghodss/yaml v1.0.0 h1:wQHKEahhL6wmXdzwWG11gIVCkOv05bNOh+Rxn0yngAk=
@@ -61,13 +111,13 @@ github.com/gin-gonic/gin v1.10.1 h1:T0ujvqyCSqRopADpgPgiTT63DUQVSfojyME59Ei63pQ=
github.com/gin-gonic/gin v1.10.1/go.mod h1:4PMNQiOhvDRa013RKVbsiNwoyezlm2rm0uX/T7kzp5Y=
github.com/go-faster/errors v0.7.1 h1:MkJTnDoEdi9pDabt1dpWf7AA8/BaSYZqibYyhZ20AYg=
github.com/go-faster/errors v0.7.1/go.mod h1:5ySTjWFiphBs07IKuiL69nxdfd5+fzh1u7FPGZP2quo=
github.com/go-faster/jx v1.1.0 h1:ZsW3wD+snOdmTDy9eIVgQdjUpXRRV4rqW8NS3t+20bg=
github.com/go-faster/jx v1.1.0/go.mod h1:vKDNikrKoyUmpzaJ0OkIkRQClNHFX/nF3dnTJZb3skg=
github.com/go-faster/jx v1.2.0 h1:T2YHJPrFaYu21fJtUxC9GzmluKu8rVIFDwwGBKTDseI=
github.com/go-faster/jx v1.2.0/go.mod h1:UWLOVDmMG597a5tBFPLIWJdUxz5/2emOpfsj9Neg0PE=
github.com/go-faster/yaml v0.4.6 h1:lOK/EhI04gCpPgPhgt0bChS6bvw7G3WwI8xxVe0sw9I=
github.com/go-faster/yaml v0.4.6/go.mod h1:390dRIvV4zbnO7qC9FGo6YYutc+wyyUSHBgbXL52eXk=
github.com/go-logr/logr v1.2.2/go.mod h1:jdQByPbusPIv2/zmleS9BjJVeZ6kBagPoEUsqbVz/1A=
github.com/go-logr/logr v1.4.2 h1:6pFjapn8bFcIbiKo3XT4j/BhANplGihG6tvd+8rYgrY=
github.com/go-logr/logr v1.4.2/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
github.com/go-logr/logr v1.4.3 h1:CjnDlHq8ikf6E492q6eKboGOC0T8CDaOvkHCIg8idEI=
github.com/go-logr/logr v1.4.3/go.mod h1:9T104GzyrTigFIr8wt5mBrctHMim0Nb2HLGrmQ40KvY=
github.com/go-logr/stdr v1.2.2 h1:hSWxHoqTgW2S2qGc0LTAI563KZ5YKYRhT3MFKZMbjag=
github.com/go-logr/stdr v1.2.2/go.mod h1:mMo/vtBO5dYbehREoey6XUKy/eSumjCCveDpRre4VKE=
github.com/go-openapi/jsonpointer v0.19.3/go.mod h1:Pl9vOtqEWErmShwVjC8pYs9cog34VGT37dQOVbmoatg=
@@ -113,8 +163,8 @@ github.com/google/go-cmp v0.4.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/
github.com/google/go-cmp v0.5.0/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.5/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.5.6/go.mod h1:v8dTdLbMG2kIc/vJvl+f65V22dbkXbowE6jgT/gNBxE=
github.com/google/go-cmp v0.6.0 h1:ofyhxvXcZhMsU5ulbFiLKl/XBFqE1GSq7atu8tAmTRI=
github.com/google/go-cmp v0.6.0/go.mod h1:17dUlkBOakJ0+DkrSSNjCkIjxS6bF9zb3elmeNGIjoY=
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
github.com/google/gofuzz v1.0.0/go.mod h1:dBl0BpW6vV/+mYPU4Po3pmUjxk6FQPldtuIdl/M65Eg=
github.com/google/uuid v1.1.2/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
github.com/google/uuid v1.3.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
@@ -138,8 +188,8 @@ github.com/josharian/intern v1.0.0 h1:vlS4z54oSdjm0bgjRigI+G1HpF+tI+9rE5LLzOg8Hm
github.com/josharian/intern v1.0.0/go.mod h1:5DoeVV0s6jJacbCEi61lwdGj/aVlrQvzHFFd8Hwg//Y=
github.com/json-iterator/go v1.1.12 h1:PV8peI4a0ysnczrg+LtxykD8LfKY9ML6u2jnxaEnrnM=
github.com/json-iterator/go v1.1.12/go.mod h1:e30LSqwooZae/UwlEbR2852Gd8hjQvJoHmT4TnhNGBo=
github.com/klauspost/compress v1.17.6 h1:60eq2E/jlfwQXtvZEeBUYADs+BwKBWURIY+Gj2eRGjI=
github.com/klauspost/compress v1.17.6/go.mod h1:/dCuZOvVtNoHsyb+cuJD3itjs3NbnF6KH9zAO4BDxPM=
github.com/klauspost/compress v1.18.1 h1:bcSGx7UbpBqMChDtsF28Lw6v/G94LPrrbMbdC3JH2co=
github.com/klauspost/compress v1.18.1/go.mod h1:ZQFFVG+MdnR0P+l6wpXgIL4NTtwiKIdBnrBd8Nrxr+0=
github.com/klauspost/cpuid/v2 v2.0.9/go.mod h1:FInQzS24/EEf25PyTYn52gqo7WaD8xa0213Md/qVLRg=
github.com/klauspost/cpuid/v2 v2.2.7 h1:ZWSB3igEs+d0qvnxR/ZBzXVmxkgt8DdzP6m9pfuVLDM=
github.com/klauspost/cpuid/v2 v2.2.7/go.mod h1:Lcz8mBdAVJIBVzewtcLocK12l3Y+JytZYpaMropDUws=
@@ -157,9 +207,8 @@ github.com/mailru/easyjson v0.0.0-20190614124828-94de47d64c63/go.mod h1:C1wdFJiN
github.com/mailru/easyjson v0.0.0-20190626092158-b2ccc519800e/go.mod h1:C1wdFJiN94OJF2b5HbByQZoLdCWB1Yqtg26g4irojpc=
github.com/mailru/easyjson v0.7.6 h1:8yTIVnZgCoiM1TgqoeTl+LfU5Jg6/xL3QhGQnimLYnA=
github.com/mailru/easyjson v0.7.6/go.mod h1:xzfreul335JAWq5oZzymOObrkdz5UnU4kGfJJLY9Nlc=
github.com/mattn/go-colorable v0.1.13 h1:fFA4WZxdEF4tXPZVKMLwD8oUnCTTo08duU7wxecdEvA=
github.com/mattn/go-colorable v0.1.13/go.mod h1:7S9/ev0klgBDR4GtXTXX8a3vIGJpMovkB8vQcUbaXHg=
github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
github.com/mattn/go-colorable v0.1.14 h1:9A9LHSqF/7dyVVX6g0U9cwm9pG3kP9gSzcuIPHPsaIE=
github.com/mattn/go-colorable v0.1.14/go.mod h1:6LmQG8QLFO4G5z1gPvYEzlUgJ2wF+stgPZH1UqBm1s8=
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
github.com/modern-go/concurrent v0.0.0-20180228061459-e0a39a4cb421/go.mod h1:6dJC0mAP4ikYIbvyc7fijjWJddQyLn8Ig3JB5CqoB9Q=
@@ -174,20 +223,24 @@ github.com/nats-io/nkeys v0.4.7/go.mod h1:kqXRgRDPlGy7nGaEDMuYzmiJCIAAWDK0IMBtDm
github.com/nats-io/nuid v1.0.1 h1:5iA8DT8V7q8WK2EScv2padNa/rTESc1KdnPw4TC2paw=
github.com/nats-io/nuid v1.0.1/go.mod h1:19wcPz3Ph3q0Jbyiqsd0kePYG7A95tJPxeL+1OSON2c=
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e/go.mod h1:zD1mROLANZcx1PVRCS0qkT7pwLkGfwJo4zjcN/Tysno=
github.com/ogen-go/ogen v1.2.1 h1:C5A0lvUMu2wl+eWIxnpXMWnuOJ26a2FyzR1CIC2qG0M=
github.com/ogen-go/ogen v1.2.1/go.mod h1:P2zQdEu8UqaVRfD5GEFvl+9q63VjMLvDquq1wVbyInM=
github.com/ogen-go/ogen v1.18.0 h1:6RQ7lFBjOeNaUWu4getfqIh4GJbEY4hqKuzDtec/g60=
github.com/ogen-go/ogen v1.18.0/go.mod h1:dHFr2Wf6cA7tSxMI+zPC21UR5hAlDw8ZYUkK3PziURY=
github.com/pelletier/go-toml/v2 v2.2.2 h1:aYUidT7k73Pcl9nb2gScu7NSrKCSHIDE89b3+6Wq+LM=
github.com/pelletier/go-toml/v2 v2.2.2/go.mod h1:1t835xjRzz80PqgE6HHgN2JOsmgYu/h4qDAS4n929Rs=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/prometheus/client_model v0.0.0-20190812154241-14fe0d1b01d4/go.mod h1:xMI15A0UPsDsEKsMN9yxemIoYk6Tm2C1GtYGdfGttqA=
github.com/redis/go-redis/v9 v9.10.0 h1:FxwK3eV8p/CQa0Ch276C7u2d0eNC9kCmAYQ7mCXCzVs=
github.com/redis/go-redis/v9 v9.10.0/go.mod h1:huWgSWd8mW6+m0VPhJjSSQ+d6Nh1VICQ6Q5lHuCH/Iw=
github.com/rogpeppe/fastuuid v1.2.0/go.mod h1:jVj6XXZzXRy/MSR5jhDC/2q6DgLz+nrA6LYCDYWNEvQ=
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
github.com/russross/blackfriday/v2 v2.1.0 h1:JIOH55/0cWyOuilr9/qlrm0BSXldqnqwMsf35Ld67mk=
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/segmentio/asm v1.2.0 h1:9BQrFxC+YOHJlTlHGkTrFWf59nbL3XnCoFLTwDCI7ys=
github.com/segmentio/asm v1.2.0/go.mod h1:BqMnlJP91P8d+4ibuonYZw9mfnzI9HfxselHZr5aAcs=
github.com/segmentio/asm v1.2.1 h1:DTNbBqs57ioxAD4PrArqftgypG4/qNpXoJx8TVXxPR0=
github.com/segmentio/asm v1.2.1/go.mod h1:BqMnlJP91P8d+4ibuonYZw9mfnzI9HfxselHZr5aAcs=
github.com/shopspring/decimal v1.4.0 h1:bxl37RwXBklmTi0C79JfXCEBD1cqqHt0bbgBAGFp81k=
github.com/shopspring/decimal v1.4.0/go.mod h1:gawqmDU56v4yIKSwfBSFip1HdCCXN8/+DMd9qYNcwME=
github.com/sirupsen/logrus v1.8.1/go.mod h1:yWOB1SBYBC5VeMP7gHvWumXLIWorT60ONWic61uBYv0=
github.com/sirupsen/logrus v1.9.3 h1:dueUQJ1C2q9oE3F7wvmSGAaVtTmUizReu6fjN8uqzbQ=
github.com/sirupsen/logrus v1.9.3/go.mod h1:naHLuLoDiP4jHNo9R0sCBMtWGeIprob74mVsIT4qYEQ=
@@ -204,8 +257,9 @@ github.com/stretchr/testify v1.7.1/go.mod h1:6Fq8oRcR53rry900zMqJjRRixrwX3KX962/
github.com/stretchr/testify v1.8.0/go.mod h1:yNjHg4UonilssWZ8iaSj1OCr/vHnekPRkoO+kdMU+MU=
github.com/stretchr/testify v1.8.1/go.mod h1:w2LPCIKwWwSfY2zedu0+kehJoqGctiVI29o6fzry7u4=
github.com/stretchr/testify v1.8.4/go.mod h1:sz/lmYIOXD/1dqDmKjjqLyZ2RngseejIcXlSw2iwfAo=
github.com/stretchr/testify v1.9.0 h1:HtqpIVDClZ4nwg75+f6Lvsy/wHu+3BoSGCbBAcpTsTg=
github.com/stretchr/testify v1.9.0/go.mod h1:r2ic/lqez/lEtzL7wO/rwa5dbSLXVDPFyf8C91i36aY=
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
github.com/swaggo/files v1.0.1 h1:J1bVJ4XHZNq0I46UU90611i9/YzdrF7x92oX1ig5IdE=
github.com/swaggo/files v1.0.1/go.mod h1:0qXmMNH6sXNf+73t65aKeB+ApmgxdnkQzVTAj2uaMUg=
github.com/swaggo/gin-swagger v1.6.0 h1:y8sxvQ3E20/RCyrXeFfg60r6H0Z+SwpTjMYsMm+zy8M=
@@ -221,36 +275,44 @@ github.com/urfave/cli/v2 v2.27.6/go.mod h1:3Sevf16NykTbInEnD0yKkjDAeZDS0A6bzhBH5
github.com/xrash/smetrics v0.0.0-20240521201337-686a1a2994c1 h1:gEOO8jv9F4OT7lGCjxCBTO/36wtF6j2nSip77qHd4x4=
github.com/xrash/smetrics v0.0.0-20240521201337-686a1a2994c1/go.mod h1:Ohn+xnUBiLI6FVj/9LpzZWtj1/D6lUovWYBkxHVV3aM=
github.com/yuin/goldmark v1.4.13/go.mod h1:6yULJ656Px+3vBD8DxQVa3kxgyrAnzto9xy5taEt/CY=
go.opentelemetry.io/otel v1.32.0 h1:WnBN+Xjcteh0zdk01SVqV55d/m62NJLJdIyb4y/WO5U=
go.opentelemetry.io/otel v1.32.0/go.mod h1:00DCVSB0RQcnzlwyTfqtxSm+DRr9hpYrHjNGiBHVQIg=
go.opentelemetry.io/otel/metric v1.32.0 h1:xV2umtmNcThh2/a/aCP+h64Xx5wsj8qqnkYZktzNa0M=
go.opentelemetry.io/otel/metric v1.32.0/go.mod h1:jH7CIbbK6SH2V2wE16W05BHCtIDzauciCRLoc/SyMv8=
go.opentelemetry.io/otel/trace v1.32.0 h1:WIC9mYrXf8TmY/EXuULKc8hR17vE+Hjv2cssQDe03fM=
go.opentelemetry.io/otel/trace v1.32.0/go.mod h1:+i4rkvCraA+tG6AzwloGaCtkx53Fa+L+V8e9a7YvhT8=
go.opentelemetry.io/auto/sdk v1.2.1 h1:jXsnJ4Lmnqd11kwkBV2LgLoFMZKizbCi5fNZ/ipaZ64=
go.opentelemetry.io/auto/sdk v1.2.1/go.mod h1:KRTj+aOaElaLi+wW1kO/DZRXwkF4C5xPbEe3ZiIhN7Y=
go.opentelemetry.io/otel v1.39.0 h1:8yPrr/S0ND9QEfTfdP9V+SiwT4E0G7Y5MO7p85nis48=
go.opentelemetry.io/otel v1.39.0/go.mod h1:kLlFTywNWrFyEdH0oj2xK0bFYZtHRYUdv1NklR/tgc8=
go.opentelemetry.io/otel v1.40.0 h1:oA5YeOcpRTXq6NN7frwmwFR0Cn3RhTVZvXsP4duvCms=
go.opentelemetry.io/otel v1.40.0/go.mod h1:IMb+uXZUKkMXdPddhwAHm6UfOwJyh4ct1ybIlV14J0g=
go.opentelemetry.io/otel/metric v1.39.0 h1:d1UzonvEZriVfpNKEVmHXbdf909uGTOQjA0HF0Ls5Q0=
go.opentelemetry.io/otel/metric v1.39.0/go.mod h1:jrZSWL33sD7bBxg1xjrqyDjnuzTUB0x1nBERXd7Ftcs=
go.opentelemetry.io/otel/metric v1.40.0 h1:rcZe317KPftE2rstWIBitCdVp89A2HqjkxR3c11+p9g=
go.opentelemetry.io/otel/metric v1.40.0/go.mod h1:ib/crwQH7N3r5kfiBZQbwrTge743UDc7DTFVZrrXnqc=
go.opentelemetry.io/otel/trace v1.39.0 h1:2d2vfpEDmCJ5zVYz7ijaJdOF59xLomrvj7bjt6/qCJI=
go.opentelemetry.io/otel/trace v1.39.0/go.mod h1:88w4/PnZSazkGzz/w84VHpQafiU4EtqqlVdxWy+rNOA=
go.opentelemetry.io/otel/trace v1.40.0 h1:WA4etStDttCSYuhwvEa8OP8I5EWu24lkOzp+ZYblVjw=
go.opentelemetry.io/otel/trace v1.40.0/go.mod h1:zeAhriXecNGP/s2SEG3+Y8X9ujcJOTqQ5RgdEJcawiA=
go.opentelemetry.io/proto/otlp v0.7.0/go.mod h1:PqfVotwruBrMGOCsRd/89rSnXhoiJIqeYNgFYFoEGnI=
go.uber.org/goleak v1.3.0 h1:2K3zAYmnTNqV73imy9J1T3WC+gmCePx2hEGkimedGto=
go.uber.org/goleak v1.3.0/go.mod h1:CoHD4mav9JJNrW/WLlf7HGZPjdw8EucARQHekz1X6bE=
go.uber.org/multierr v1.11.0 h1:blXXJkSxSSfBVBlC76pxqeO+LN3aDfLQo+309xJstO0=
go.uber.org/multierr v1.11.0/go.mod h1:20+QtiLqy0Nd6FdQB9TLXag12DsQkrbs3htMFfDN80Y=
go.uber.org/zap v1.27.0 h1:aJMhYGrd5QSmlpLMr2MftRKl7t8J8PTZPA732ud/XR8=
go.uber.org/zap v1.27.0/go.mod h1:GB2qFLM7cTU87MWRP2mPIjqfIDnGu+VIO4V/SdhGo2E=
go.uber.org/zap v1.27.1 h1:08RqriUEv8+ArZRYSTXy1LeBScaMpVSTBhCeaZYfMYc=
go.uber.org/zap v1.27.1/go.mod h1:GB2qFLM7cTU87MWRP2mPIjqfIDnGu+VIO4V/SdhGo2E=
golang.org/x/arch v0.0.0-20210923205945-b76863e36670/go.mod h1:5om86z9Hs0C8fWVUuoMHwpExlXzs5Tkyp9hOrfG7pp8=
golang.org/x/arch v0.8.0 h1:3wRIsP3pM4yUptoR96otTUOXI367OS0+c9eeRi9doIc=
golang.org/x/arch v0.8.0/go.mod h1:FEVrYAQjsQXMVJ1nsMoVVXPZg6p2JE2mx8psSWTDQys=
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
golang.org/x/crypto v0.0.0-20200622213623-75b288015ac9/go.mod h1:LzIPMQfyMNhhGPhUkYOs5KpL4U8rLKemX1yGLhDgUto=
golang.org/x/crypto v0.0.0-20210921155107-089bfa567519/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
golang.org/x/crypto v0.32.0 h1:euUpcYgM8WcP71gNpTqQCn6rC2t6ULUPiOzfWaXVVfc=
golang.org/x/crypto v0.32.0/go.mod h1:ZnnJkOaASj8g0AjIduWNlq2NRxL0PlBrbKVyZ6V/Ugc=
golang.org/x/crypto v0.46.0 h1:cKRW/pmt1pKAfetfu+RCEvjvZkA9RimPbh7bhFjGVBU=
golang.org/x/crypto v0.46.0/go.mod h1:Evb/oLKmMraqjZ2iQTwDwvCtJkczlDuTmdJXoZVzqU0=
golang.org/x/exp v0.0.0-20190121172915-509febef88a4/go.mod h1:CJ0aWSM057203Lf6IL+f9T1iT9GByDxfZKAQTCR3kQA=
golang.org/x/exp v0.0.0-20240531132922-fd00a4e0eefc h1:O9NuF4s+E/PvMIy+9IUZB9znFwUIXEWSstNjek6VpVg=
golang.org/x/exp v0.0.0-20240531132922-fd00a4e0eefc/go.mod h1:XtvwrStGgqGPLc4cjQfWqZHG1YFdYs6swckp8vpsjnc=
golang.org/x/exp v0.0.0-20251219203646-944ab1f22d93 h1:fQsdNF2N+/YewlRZiricy4P1iimyPKZ/xwniHj8Q2a0=
golang.org/x/exp v0.0.0-20251219203646-944ab1f22d93/go.mod h1:EPRbTFwzwjXj9NpYyyrvenVh9Y+GFeEvMNh7Xuz7xgU=
golang.org/x/lint v0.0.0-20181026193005-c67002cb31c3/go.mod h1:UVdnD1Gm6xHRNCYTkRU2/jEulfH38KcIWyp/GAMgvoE=
golang.org/x/lint v0.0.0-20190227174305-5b3e6a55c961/go.mod h1:wehouNa3lNwaWXcvxsM5YxQ5yQlVC4a0KAMCusXpPoU=
golang.org/x/lint v0.0.0-20190313153728-d0100b6bd8b3/go.mod h1:6SW0HCj/g11FgYtHlgUYUwCkIfeOF89ocIRzGO/8vkc=
golang.org/x/mod v0.6.0-dev.0.20220419223038-86c51ed26bb4/go.mod h1:jJ57K6gSWd91VN4djpZkiMVwK6gcyfeH4XE8wZrZaV4=
golang.org/x/mod v0.17.0 h1:zY54UmvipHiNd+pm+m0x9KhZ9hl1/7QNMyxXbc6ICqA=
golang.org/x/mod v0.17.0/go.mod h1:hTbmBsO62+eylJbnUtE2MGJUyE7QWk4xUqPFrRgJ+7c=
golang.org/x/mod v0.31.0 h1:HaW9xtz0+kOcWKwli0ZXy79Ix+UW/vOfmWI5QVd2tgI=
golang.org/x/mod v0.31.0/go.mod h1:43JraMp9cGx1Rx3AqioxrbrhNsLl2l/iNAvuBkrezpg=
golang.org/x/net v0.0.0-20180724234803-3673e40ba225/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20180826012351-8a410e7b638d/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
golang.org/x/net v0.0.0-20190108225652-1e06a53dbb7e/go.mod h1:mL1N/T3taQHkDXs73rZJwtUhF3w3ftmwwsq0BUmARs4=
@@ -264,8 +326,8 @@ golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v
golang.org/x/net v0.0.0-20210421230115-4e50805a0758/go.mod h1:72T/g9IO56b78aLF+1Kcs5dz7/ng1VjMUvfKvpfy+jM=
golang.org/x/net v0.0.0-20220722155237-a158d28d115b/go.mod h1:XRhObCWvk6IyKnWLug+ECip1KBveYUHfp+8e9klMJ9c=
golang.org/x/net v0.7.0/go.mod h1:2Tu9+aMcznHK/AK1HMvgo6xiTLG5rD5rZLDS+rp2Bjs=
golang.org/x/net v0.34.0 h1:Mb7Mrk043xzHgnRM88suvJFwzVrRfHEHJEl5/71CKw0=
golang.org/x/net v0.34.0/go.mod h1:di0qlW3YNM5oh6GqDGQr92MyTozJPmybPK4Ev/Gm31k=
golang.org/x/net v0.48.0 h1:zyQRTTrjc33Lhh0fBgT/H3oZq9WuvRR5gPC70xpDiQU=
golang.org/x/net v0.48.0/go.mod h1:+ndRgGjkh8FGtu1w1FGbEC31if4VrNVMuKTgcAAnQRY=
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
golang.org/x/oauth2 v0.0.0-20200107190931-bf48bf16ab8d/go.mod h1:gOpvHmFTYa4IltrdGE7lF6nIHvwfUNPOp7c8zoXwtLw=
golang.org/x/sync v0.0.0-20180314180146-1d60e4601c6f/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
@@ -273,8 +335,8 @@ golang.org/x/sync v0.0.0-20181108010431-42b317875d0f/go.mod h1:RxMgew5VJxzue5/jJ
golang.org/x/sync v0.0.0-20181221193216-37e7f081c4d4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20190423024810-112230192c58/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.0.0-20220722155255-886fb9371eb4/go.mod h1:RxMgew5VJxzue5/jJTE5uejpjVlOe/izrB70Jof72aM=
golang.org/x/sync v0.12.0 h1:MHc5BpPuC30uJk597Ri8TV3CNZcTLu6B6z4lJy+g6Jw=
golang.org/x/sync v0.12.0/go.mod h1:1dzgHSNfp02xaA81J2MS99Qcpr2w7fw1gpm99rleRqA=
golang.org/x/sync v0.19.0 h1:vV+1eWNmZ5geRlYjzm2adRgW2/mcpevXNg50YZtPCE4=
golang.org/x/sync v0.19.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.0.0-20180830151530-49385e6e1522/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
golang.org/x/sys v0.0.0-20190412213103-97732733099d/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
@@ -288,11 +350,10 @@ golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBc
golang.org/x/sys v0.0.0-20220520151302-bc2c85ada10a/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220722155257-8c9f86f7a55f/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.5.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/sys v0.29.0 h1:TPYlXGxvx1MGTn2GiZDhnjPA9wZzZeGKHHmKhHYvgaU=
golang.org/x/sys v0.29.0/go.mod h1:/VUhepiaJMQUp4+oa/7Zr1D23ma6VTLIYjOOTFZPUcA=
golang.org/x/sys v0.39.0 h1:CvCKL8MeisomCi6qNZ+wbb0DN9E5AATixKsvNtMoMFk=
golang.org/x/sys v0.39.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
golang.org/x/term v0.0.0-20210927222741-03fcf44c2211/go.mod h1:jbD1KX2456YbFQfuXm/mYQcufACuNUgVhRMnK/tPxf8=
golang.org/x/term v0.5.0/go.mod h1:jMB1sMXY+tzblOD4FWmEbocvup2/aLOaQEp7JmGp78k=
@@ -301,8 +362,8 @@ golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
golang.org/x/text v0.3.7/go.mod h1:u+2+/6zg+i71rQMx5EYifcz6MCKuco9NR6JIITiCfzQ=
golang.org/x/text v0.7.0/go.mod h1:mrYo+phRRbMaCq/xk9113O4dZlRixOauAjOtrjsXDZ8=
golang.org/x/text v0.23.0 h1:D71I7dUrlY+VX0gQShAThNGHFxZ13dGLBHQLVl1mJlY=
golang.org/x/text v0.23.0/go.mod h1:/BLNzu4aZCJ1+kcD0DNRotWKage4q2rGVAg4o22unh4=
golang.org/x/text v0.32.0 h1:ZD01bjUt1FQ9WJ0ClOL5vxgxOI/sVCNgX1YtKwcY0mU=
golang.org/x/text v0.32.0/go.mod h1:o/rUWzghvpD5TXrTIBuJU77MTaN0ljMWE47kxGJQ7jY=
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190114222345-bf090417da8b/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
golang.org/x/tools v0.0.0-20190226205152-f727befe758c/go.mod h1:9Yl7xja0Znq3iFh3HoIrodX9oNMXvdceNzlUR8zjMvY=
@@ -310,8 +371,8 @@ golang.org/x/tools v0.0.0-20190311212946-11955173bddd/go.mod h1:LCzVGOaR6xXOjkQ3
golang.org/x/tools v0.0.0-20190524140312-2c0ae7006135/go.mod h1:RgjU9mgBXZiqYHBnxXauZ1Gv1EHHAz9KjViQ78xBX0Q=
golang.org/x/tools v0.0.0-20191119224855-298f0cb1881e/go.mod h1:b+2E5dAYhXwXZwtnZ6UAqBI28+e2cm9otk0dWdXHAEo=
golang.org/x/tools v0.1.12/go.mod h1:hNGJHUnrk76NpqgfD5Aqm5Crs+Hm0VOH/i9J2+nxYbc=
golang.org/x/tools v0.21.1-0.20240508182429-e35e4ccd0d2d h1:vU5i/LfpvrRCpgM/VPfJLg5KjxD3E+hfT1SH+d9zLwg=
golang.org/x/tools v0.21.1-0.20240508182429-e35e4ccd0d2d/go.mod h1:aiJjzUbINMkxbQROHiO6hDPo2LHcIPhhQsa9DLh0yGk=
golang.org/x/tools v0.40.0 h1:yLkxfA+Qnul4cs9QA3KnlFu0lVmd8JJfoq+E41uSutA=
golang.org/x/tools v0.40.0/go.mod h1:Ik/tzLRlbscWpqqMRjyWYDisX8bG13FrdXp3o4Sr9lc=
golang.org/x/xerrors v0.0.0-20190717185122-a985d3407aa7/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
golang.org/x/xerrors v0.0.0-20200804184101-5ec99f83aff1/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=

View File

@@ -14,15 +14,41 @@ tags:
description: Long-running operations
- name: Session
description: Session queries
- name: Stats
description: Statistics queries
- name: Submissions
description: Submission operations
- name: Scripts
description: Script operations
- name: ScriptPolicy
description: Script policy operations
- name: Thumbnails
description: Thumbnail operations
- name: Users
description: User operations
security:
- cookieAuth: []
paths:
/stats:
get:
summary: Get aggregate statistics
operationId: getStats
tags:
- Stats
security: []
responses:
"200":
description: Successful response
content:
application/json:
schema:
$ref: "#/components/schemas/Stats"
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/session/user:
get:
summary: Get information about the currently logged in user
@@ -158,6 +184,29 @@ paths:
application/json:
schema:
$ref: "#/components/schemas/Error"
/maps/{MapID}/combobulate:
post:
summary: Queue a map for combobulator processing
operationId: combobulateMap
tags:
- Maps
parameters:
- name: MapID
in: path
required: true
schema:
type: integer
format: int64
minimum: 0
responses:
"204":
description: Successful response
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/maps/{MapID}/download:
get:
summary: Download the map asset
@@ -186,6 +235,21 @@ paths:
application/json:
schema:
$ref: "#/components/schemas/Error"
/maps-admin/seed-combobulator:
post:
summary: Queue all maps for combobulator processing
operationId: seedCombobulator
tags:
- Maps
responses:
"204":
description: Successful response
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/mapfixes:
get:
summary: Get list of mapfixes
@@ -421,6 +485,30 @@ paths:
application/json:
schema:
$ref: "#/components/schemas/Error"
/mapfixes/{MapfixID}/description:
patch:
summary: Update description (submitter only)
operationId: updateMapfixDescription
tags:
- Mapfixes
parameters:
- $ref: '#/components/parameters/MapfixID'
requestBody:
required: true
content:
text/plain:
schema:
type: string
maxLength: 256
responses:
"204":
description: Successful response
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/mapfixes/{MapfixID}/completed:
post:
summary: Called by maptest when a player completes the map
@@ -1438,6 +1526,222 @@ paths:
application/json:
schema:
$ref: "#/components/schemas/Error"
/thumbnails/assets:
post:
summary: Batch fetch asset thumbnails
operationId: batchAssetThumbnails
tags:
- Thumbnails
security: []
requestBody:
required: true
content:
application/json:
schema:
type: object
required:
- assetIds
properties:
assetIds:
type: array
items:
type: integer
format: uint64
maxItems: 100
description: Array of asset IDs (max 100)
size:
type: string
enum:
- "150x150"
- "420x420"
- "768x432"
default: "420x420"
description: Thumbnail size
responses:
"200":
description: Successful response
content:
application/json:
schema:
type: object
properties:
thumbnails:
type: object
additionalProperties:
type: string
description: Map of asset ID to thumbnail URL
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/thumbnails/asset/{AssetID}:
get:
summary: Get single asset thumbnail
operationId: getAssetThumbnail
tags:
- Thumbnails
security: []
parameters:
- name: AssetID
in: path
required: true
schema:
type: integer
format: uint64
- name: size
in: query
schema:
type: string
enum:
- "150x150"
- "420x420"
- "768x432"
default: "420x420"
responses:
"302":
description: Redirect to thumbnail URL
headers:
Location:
description: URL to redirect to
schema:
type: string
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/thumbnails/users:
post:
summary: Batch fetch user avatar thumbnails
operationId: batchUserThumbnails
tags:
- Thumbnails
security: []
requestBody:
required: true
content:
application/json:
schema:
type: object
required:
- userIds
properties:
userIds:
type: array
items:
type: integer
format: uint64
maxItems: 100
description: Array of user IDs (max 100)
size:
type: string
enum:
- "150x150"
- "420x420"
- "768x432"
default: "150x150"
description: Thumbnail size
responses:
"200":
description: Successful response
content:
application/json:
schema:
type: object
properties:
thumbnails:
type: object
additionalProperties:
type: string
description: Map of user ID to thumbnail URL
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/thumbnails/user/{UserID}:
get:
summary: Get single user avatar thumbnail
operationId: getUserThumbnail
tags:
- Thumbnails
security: []
parameters:
- name: UserID
in: path
required: true
schema:
type: integer
format: uint64
- name: size
in: query
schema:
type: string
enum:
- "150x150"
- "420x420"
- "768x432"
default: "150x150"
responses:
"302":
description: Redirect to thumbnail URL
headers:
Location:
description: URL to redirect to
schema:
type: string
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
/usernames:
post:
summary: Batch fetch usernames
operationId: batchUsernames
tags:
- Users
security: []
requestBody:
required: true
content:
application/json:
schema:
type: object
required:
- userIds
properties:
userIds:
type: array
items:
type: integer
format: uint64
maxItems: 100
description: Array of user IDs (max 100)
responses:
"200":
description: Successful response
content:
application/json:
schema:
type: object
properties:
usernames:
type: object
additionalProperties:
type: string
description: Map of user ID to username
default:
description: General Error
content:
application/json:
schema:
$ref: "#/components/schemas/Error"
components:
securitySchemes:
cookieAuth:
@@ -2061,6 +2365,47 @@ components:
type: integer
format: int32
minimum: 0
Stats:
description: Aggregate statistics for submissions and mapfixes
type: object
properties:
TotalSubmissions:
type: integer
format: int64
minimum: 0
description: Total number of submissions
TotalMapfixes:
type: integer
format: int64
minimum: 0
description: Total number of mapfixes
ReleasedSubmissions:
type: integer
format: int64
minimum: 0
description: Number of released submissions
ReleasedMapfixes:
type: integer
format: int64
minimum: 0
description: Number of released mapfixes
SubmittedSubmissions:
type: integer
format: int64
minimum: 0
description: Number of submissions under review
SubmittedMapfixes:
type: integer
format: int64
minimum: 0
description: Number of mapfixes under review
required:
- TotalSubmissions
- TotalMapfixes
- ReleasedSubmissions
- ReleasedMapfixes
- SubmittedSubmissions
- SubmittedMapfixes
Error:
description: Represents error object
type: object


@@ -4,15 +4,16 @@ package api
import (
"net/http"
"go.opentelemetry.io/otel"
"go.opentelemetry.io/otel/metric"
"go.opentelemetry.io/otel/trace"
"strings"
ht "github.com/ogen-go/ogen/http"
"github.com/ogen-go/ogen/middleware"
"github.com/ogen-go/ogen/ogenerrors"
"github.com/ogen-go/ogen/otelogen"
"go.opentelemetry.io/otel"
"go.opentelemetry.io/otel/attribute"
"go.opentelemetry.io/otel/metric"
"go.opentelemetry.io/otel/trace"
)
var (
@@ -32,6 +33,7 @@ type otelConfig struct {
Tracer trace.Tracer
MeterProvider metric.MeterProvider
Meter metric.Meter
Attributes []attribute.KeyValue
}
func (cfg *otelConfig) initOTEL() {
@@ -81,18 +83,8 @@ func (o otelOptionFunc) applyServer(c *serverConfig) {
func newServerConfig(opts ...ServerOption) serverConfig {
cfg := serverConfig{
NotFound: http.NotFound,
MethodNotAllowed: func(w http.ResponseWriter, r *http.Request, allowed string) {
status := http.StatusMethodNotAllowed
if r.Method == "OPTIONS" {
w.Header().Set("Access-Control-Allow-Methods", allowed)
w.Header().Set("Access-Control-Allow-Headers", "Content-Type")
status = http.StatusNoContent
} else {
w.Header().Set("Allow", allowed)
}
w.WriteHeader(status)
},
NotFound: http.NotFound,
MethodNotAllowed: nil,
ErrorHandler: ogenerrors.DefaultErrorHandler,
Middleware: nil,
MaxMultipartMemory: 32 << 20, // 32 MB
@@ -115,8 +107,44 @@ func (s baseServer) notFound(w http.ResponseWriter, r *http.Request) {
s.cfg.NotFound(w, r)
}
func (s baseServer) notAllowed(w http.ResponseWriter, r *http.Request, allowed string) {
s.cfg.MethodNotAllowed(w, r, allowed)
type notAllowedParams struct {
allowedMethods string
allowedHeaders map[string]string
acceptPost string
acceptPatch string
}
func (s baseServer) notAllowed(w http.ResponseWriter, r *http.Request, params notAllowedParams) {
h := w.Header()
isOptions := r.Method == "OPTIONS"
if isOptions {
h.Set("Access-Control-Allow-Methods", params.allowedMethods)
if params.allowedHeaders != nil {
m := r.Header.Get("Access-Control-Request-Method")
if m != "" {
allowedHeaders, ok := params.allowedHeaders[strings.ToUpper(m)]
if ok {
h.Set("Access-Control-Allow-Headers", allowedHeaders)
}
}
}
if params.acceptPost != "" {
h.Set("Accept-Post", params.acceptPost)
}
if params.acceptPatch != "" {
h.Set("Accept-Patch", params.acceptPatch)
}
}
if s.cfg.MethodNotAllowed != nil {
s.cfg.MethodNotAllowed(w, r, params.allowedMethods)
return
}
status := http.StatusNoContent
if !isOptions {
h.Set("Allow", params.allowedMethods)
status = http.StatusMethodNotAllowed
}
w.WriteHeader(status)
}
func (cfg serverConfig) baseServer() (s baseServer, err error) {
@@ -215,6 +243,13 @@ func WithMeterProvider(provider metric.MeterProvider) Option {
})
}
// WithAttributes specifies default otel attributes.
func WithAttributes(attributes ...attribute.KeyValue) Option {
return otelOptionFunc(func(cfg *otelConfig) {
cfg.Attributes = attributes
})
}
// WithClient specifies http client to use.
func WithClient(client ht.Client) ClientOption {
return optionFunc[clientConfig](func(cfg *clientConfig) {

File diff suppressed because it is too large


@@ -0,0 +1,19 @@
// Code generated by ogen, DO NOT EDIT.
package api
// setDefaults set default value of fields.
func (s *BatchAssetThumbnailsReq) setDefaults() {
{
val := BatchAssetThumbnailsReqSize("420x420")
s.Size.SetTo(val)
}
}
// setDefaults set default value of fields.
func (s *BatchUserThumbnailsReq) setDefaults() {
{
val := BatchUserThumbnailsReqSize("150x150")
s.Size.SetTo(val)
}
}

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -30,6 +30,10 @@ const (
ActionSubmissionTriggerUploadOperation OperationName = "ActionSubmissionTriggerUpload"
ActionSubmissionTriggerValidateOperation OperationName = "ActionSubmissionTriggerValidate"
ActionSubmissionValidatedOperation OperationName = "ActionSubmissionValidated"
BatchAssetThumbnailsOperation OperationName = "BatchAssetThumbnails"
BatchUserThumbnailsOperation OperationName = "BatchUserThumbnails"
BatchUsernamesOperation OperationName = "BatchUsernames"
CombobulateMapOperation OperationName = "CombobulateMap"
CreateMapfixOperation OperationName = "CreateMapfix"
CreateMapfixAuditCommentOperation OperationName = "CreateMapfixAuditComment"
CreateScriptOperation OperationName = "CreateScript"
@@ -40,12 +44,15 @@ const (
DeleteScriptOperation OperationName = "DeleteScript"
DeleteScriptPolicyOperation OperationName = "DeleteScriptPolicy"
DownloadMapAssetOperation OperationName = "DownloadMapAsset"
GetAssetThumbnailOperation OperationName = "GetAssetThumbnail"
GetMapOperation OperationName = "GetMap"
GetMapfixOperation OperationName = "GetMapfix"
GetOperationOperation OperationName = "GetOperation"
GetScriptOperation OperationName = "GetScript"
GetScriptPolicyOperation OperationName = "GetScriptPolicy"
GetStatsOperation OperationName = "GetStats"
GetSubmissionOperation OperationName = "GetSubmission"
GetUserThumbnailOperation OperationName = "GetUserThumbnail"
ListMapfixAuditEventsOperation OperationName = "ListMapfixAuditEvents"
ListMapfixesOperation OperationName = "ListMapfixes"
ListMapsOperation OperationName = "ListMaps"
@@ -54,11 +61,13 @@ const (
ListSubmissionAuditEventsOperation OperationName = "ListSubmissionAuditEvents"
ListSubmissionsOperation OperationName = "ListSubmissions"
ReleaseSubmissionsOperation OperationName = "ReleaseSubmissions"
SeedCombobulatorOperation OperationName = "SeedCombobulator"
SessionRolesOperation OperationName = "SessionRoles"
SessionUserOperation OperationName = "SessionUser"
SessionValidateOperation OperationName = "SessionValidate"
SetMapfixCompletedOperation OperationName = "SetMapfixCompleted"
SetSubmissionCompletedOperation OperationName = "SetSubmissionCompleted"
UpdateMapfixDescriptionOperation OperationName = "UpdateMapfixDescription"
UpdateMapfixModelOperation OperationName = "UpdateMapfixModel"
UpdateScriptOperation OperationName = "UpdateScript"
UpdateScriptPolicyOperation OperationName = "UpdateScriptPolicy"

File diff suppressed because it is too large


@@ -3,6 +3,7 @@
package api
import (
"bytes"
"fmt"
"io"
"mime"
@@ -10,13 +11,13 @@ import (
"github.com/go-faster/errors"
"github.com/go-faster/jx"
"github.com/ogen-go/ogen/ogenerrors"
"github.com/ogen-go/ogen/validate"
)
func (s *Server) decodeCreateMapfixRequest(r *http.Request) (
req *MapfixTriggerCreate,
func (s *Server) decodeBatchAssetThumbnailsRequest(r *http.Request) (
req *BatchAssetThumbnailsReq,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -37,22 +38,266 @@ func (s *Server) decodeCreateMapfixRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request BatchAssetThumbnailsReq
if err := func() error {
if err := request.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, rawBody, close, nil
default:
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeBatchUserThumbnailsRequest(r *http.Request) (
req *BatchUserThumbnailsReq,
rawBody []byte,
close func() error,
rerr error,
) {
var closers []func() error
close = func() error {
var merr error
// Close in reverse order, to match defer behavior.
for i := len(closers) - 1; i >= 0; i-- {
c := closers[i]
merr = errors.Join(merr, c())
}
return merr
}
defer func() {
if rerr != nil {
rerr = errors.Join(rerr, close())
}
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request BatchUserThumbnailsReq
if err := func() error {
if err := request.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, rawBody, close, nil
default:
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeBatchUsernamesRequest(r *http.Request) (
req *BatchUsernamesReq,
rawBody []byte,
close func() error,
rerr error,
) {
var closers []func() error
close = func() error {
var merr error
// Close in reverse order, to match defer behavior.
for i := len(closers) - 1; i >= 0; i-- {
c := closers[i]
merr = errors.Join(merr, c())
}
return merr
}
defer func() {
if rerr != nil {
rerr = errors.Join(rerr, close())
}
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request BatchUsernamesReq
if err := func() error {
if err := request.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, rawBody, close, nil
default:
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeCreateMapfixRequest(r *http.Request) (
req *MapfixTriggerCreate,
rawBody []byte,
close func() error,
rerr error,
) {
var closers []func() error
close = func() error {
var merr error
// Close in reverse order, to match defer behavior.
for i := len(closers) - 1; i >= 0; i-- {
c := closers[i]
merr = errors.Join(merr, c())
}
return merr
}
defer func() {
if rerr != nil {
rerr = errors.Join(rerr, close())
}
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request MapfixTriggerCreate
@@ -70,7 +315,7 @@ func (s *Server) decodeCreateMapfixRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
@@ -78,16 +323,17 @@ func (s *Server) decodeCreateMapfixRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, close, nil
return &request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeCreateMapfixAuditCommentRequest(r *http.Request) (
req CreateMapfixAuditCommentReq,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -108,20 +354,21 @@ func (s *Server) decodeCreateMapfixAuditCommentRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "text/plain":
reader := r.Body
request := CreateMapfixAuditCommentReq{Data: reader}
return request, close, nil
return request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeCreateScriptRequest(r *http.Request) (
req *ScriptCreate,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -142,22 +389,29 @@ func (s *Server) decodeCreateScriptRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request ScriptCreate
@@ -175,7 +429,7 @@ func (s *Server) decodeCreateScriptRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
@@ -183,16 +437,17 @@ func (s *Server) decodeCreateScriptRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, close, nil
return &request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeCreateScriptPolicyRequest(r *http.Request) (
req *ScriptPolicyCreate,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -213,22 +468,29 @@ func (s *Server) decodeCreateScriptPolicyRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request ScriptPolicyCreate
@@ -246,7 +508,7 @@ func (s *Server) decodeCreateScriptPolicyRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
@@ -254,16 +516,17 @@ func (s *Server) decodeCreateScriptPolicyRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, close, nil
return &request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeCreateSubmissionRequest(r *http.Request) (
req *SubmissionTriggerCreate,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -284,22 +547,29 @@ func (s *Server) decodeCreateSubmissionRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request SubmissionTriggerCreate
@@ -317,7 +587,7 @@ func (s *Server) decodeCreateSubmissionRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
@@ -325,16 +595,17 @@ func (s *Server) decodeCreateSubmissionRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, close, nil
return &request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeCreateSubmissionAdminRequest(r *http.Request) (
req *SubmissionTriggerCreate,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -355,22 +626,29 @@ func (s *Server) decodeCreateSubmissionAdminRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request SubmissionTriggerCreate
@@ -388,7 +666,7 @@ func (s *Server) decodeCreateSubmissionAdminRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
@@ -396,16 +674,17 @@ func (s *Server) decodeCreateSubmissionAdminRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, close, nil
return &request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeCreateSubmissionAuditCommentRequest(r *http.Request) (
req CreateSubmissionAuditCommentReq,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -426,20 +705,21 @@ func (s *Server) decodeCreateSubmissionAuditCommentRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "text/plain":
reader := r.Body
request := CreateSubmissionAuditCommentReq{Data: reader}
return request, close, nil
return request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeReleaseSubmissionsRequest(r *http.Request) (
req []ReleaseInfo,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -460,22 +740,29 @@ func (s *Server) decodeReleaseSubmissionsRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request []ReleaseInfo
@@ -501,7 +788,7 @@ func (s *Server) decodeReleaseSubmissionsRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if request == nil {
@@ -534,16 +821,17 @@ func (s *Server) decodeReleaseSubmissionsRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return request, close, nil
return request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeUpdateScriptRequest(r *http.Request) (
req *ScriptUpdate,
func (s *Server) decodeUpdateMapfixDescriptionRequest(r *http.Request) (
req UpdateMapfixDescriptionReq,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -564,22 +852,64 @@ func (s *Server) decodeUpdateScriptRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "text/plain":
reader := r.Body
request := UpdateMapfixDescriptionReq{Data: reader}
return request, rawBody, close, nil
default:
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeUpdateScriptRequest(r *http.Request) (
req *ScriptUpdate,
rawBody []byte,
close func() error,
rerr error,
) {
var closers []func() error
close = func() error {
var merr error
// Close in reverse order, to match defer behavior.
for i := len(closers) - 1; i >= 0; i-- {
c := closers[i]
merr = errors.Join(merr, c())
}
return merr
}
defer func() {
if rerr != nil {
rerr = errors.Join(rerr, close())
}
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request ScriptUpdate
@@ -597,7 +927,7 @@ func (s *Server) decodeUpdateScriptRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
@@ -605,16 +935,17 @@ func (s *Server) decodeUpdateScriptRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, close, nil
return &request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}
func (s *Server) decodeUpdateScriptPolicyRequest(r *http.Request) (
req *ScriptPolicyUpdate,
rawBody []byte,
close func() error,
rerr error,
) {
@@ -635,22 +966,29 @@ func (s *Server) decodeUpdateScriptPolicyRequest(r *http.Request) (
}()
ct, _, err := mime.ParseMediaType(r.Header.Get("Content-Type"))
if err != nil {
return req, close, errors.Wrap(err, "parse media type")
return req, rawBody, close, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
if r.ContentLength == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
buf, err := io.ReadAll(r.Body)
defer func() {
_ = r.Body.Close()
}()
if err != nil {
return req, close, err
return req, rawBody, close, err
}
// Reset the body to allow for downstream reading.
r.Body = io.NopCloser(bytes.NewBuffer(buf))
if len(buf) == 0 {
return req, close, validate.ErrBodyRequired
return req, rawBody, close, validate.ErrBodyRequired
}
rawBody = append(rawBody, buf...)
d := jx.DecodeBytes(buf)
var request ScriptPolicyUpdate
@@ -668,7 +1006,7 @@ func (s *Server) decodeUpdateScriptPolicyRequest(r *http.Request) (
Body: buf,
Err: err,
}
return req, close, err
return req, rawBody, close, err
}
if err := func() error {
if err := request.Validate(); err != nil {
@@ -676,10 +1014,10 @@ func (s *Server) decodeUpdateScriptPolicyRequest(r *http.Request) (
}
return nil
}(); err != nil {
return req, close, errors.Wrap(err, "validate")
return req, rawBody, close, errors.Wrap(err, "validate")
}
return &request, close, nil
return &request, rawBody, close, nil
default:
return req, close, validate.InvalidContentType(ct)
return req, rawBody, close, validate.InvalidContentType(ct)
}
}


@@ -7,10 +7,51 @@ import (
"net/http"
"github.com/go-faster/jx"
ht "github.com/ogen-go/ogen/http"
)
func encodeBatchAssetThumbnailsRequest(
req *BatchAssetThumbnailsReq,
r *http.Request,
) error {
const contentType = "application/json"
e := new(jx.Encoder)
{
req.Encode(e)
}
encoded := e.Bytes()
ht.SetBody(r, bytes.NewReader(encoded), contentType)
return nil
}
func encodeBatchUserThumbnailsRequest(
req *BatchUserThumbnailsReq,
r *http.Request,
) error {
const contentType = "application/json"
e := new(jx.Encoder)
{
req.Encode(e)
}
encoded := e.Bytes()
ht.SetBody(r, bytes.NewReader(encoded), contentType)
return nil
}
func encodeBatchUsernamesRequest(
req *BatchUsernamesReq,
r *http.Request,
) error {
const contentType = "application/json"
e := new(jx.Encoder)
{
req.Encode(e)
}
encoded := e.Bytes()
ht.SetBody(r, bytes.NewReader(encoded), contentType)
return nil
}
func encodeCreateMapfixRequest(
req *MapfixTriggerCreate,
r *http.Request,
@@ -119,6 +160,16 @@ func encodeReleaseSubmissionsRequest(
return nil
}
func encodeUpdateMapfixDescriptionRequest(
req UpdateMapfixDescriptionReq,
r *http.Request,
) error {
const contentType = "text/plain"
body := req
ht.SetBody(r, body, contentType)
return nil
}
func encodeUpdateScriptRequest(
req *ScriptUpdate,
r *http.Request,


@@ -11,8 +11,9 @@ import (
"github.com/go-faster/errors"
"github.com/go-faster/jx"
"github.com/ogen-go/ogen/conv"
"github.com/ogen-go/ogen/ogenerrors"
"github.com/ogen-go/ogen/uri"
"github.com/ogen-go/ogen/validate"
)
@@ -1456,6 +1457,342 @@ func decodeActionSubmissionValidatedResponse(resp *http.Response) (res *ActionSu
return res, errors.Wrap(defRes, "error")
}
func decodeBatchAssetThumbnailsResponse(resp *http.Response) (res *BatchAssetThumbnailsOK, _ error) {
switch resp.StatusCode {
case 200:
// Code 200.
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response BatchAssetThumbnailsOK
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
return &response, nil
default:
return res, validate.InvalidContentType(ct)
}
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeBatchUserThumbnailsResponse(resp *http.Response) (res *BatchUserThumbnailsOK, _ error) {
switch resp.StatusCode {
case 200:
// Code 200.
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response BatchUserThumbnailsOK
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
return &response, nil
default:
return res, validate.InvalidContentType(ct)
}
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeBatchUsernamesResponse(resp *http.Response) (res *BatchUsernamesOK, _ error) {
switch resp.StatusCode {
case 200:
// Code 200.
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response BatchUsernamesOK
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
return &response, nil
default:
return res, validate.InvalidContentType(ct)
}
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeCombobulateMapResponse(resp *http.Response) (res *CombobulateMapNoContent, _ error) {
switch resp.StatusCode {
case 204:
// Code 204.
return &CombobulateMapNoContent{}, nil
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeCreateMapfixResponse(resp *http.Response) (res *OperationID, _ error) {
switch resp.StatusCode {
case 201:
@@ -2277,6 +2614,105 @@ func decodeDownloadMapAssetResponse(resp *http.Response) (res DownloadMapAssetOK
return res, errors.Wrap(defRes, "error")
}
func decodeGetAssetThumbnailResponse(resp *http.Response) (res *GetAssetThumbnailFound, _ error) {
switch resp.StatusCode {
case 302:
// Code 302.
var wrapper GetAssetThumbnailFound
h := uri.NewHeaderDecoder(resp.Header)
// Parse "Location" header.
{
cfg := uri.HeaderParameterDecodingConfig{
Name: "Location",
Explode: false,
}
if err := func() error {
if err := h.HasParam(cfg); err == nil {
if err := h.DecodeParam(cfg, func(d uri.Decoder) error {
var wrapperDotLocationVal string
if err := func() error {
val, err := d.DecodeValue()
if err != nil {
return err
}
c, err := conv.ToString(val)
if err != nil {
return err
}
wrapperDotLocationVal = c
return nil
}(); err != nil {
return err
}
wrapper.Location.SetTo(wrapperDotLocationVal)
return nil
}); err != nil {
return err
}
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "parse Location header")
}
}
return &wrapper, nil
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeGetMapResponse(resp *http.Response) (res *Map, _ error) {
switch resp.StatusCode {
case 200:
@@ -2782,6 +3218,107 @@ func decodeGetScriptPolicyResponse(resp *http.Response) (res *ScriptPolicy, _ er
return res, errors.Wrap(defRes, "error")
}
func decodeGetStatsResponse(resp *http.Response) (res *Stats, _ error) {
switch resp.StatusCode {
case 200:
// Code 200.
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Stats
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &response, nil
default:
return res, validate.InvalidContentType(ct)
}
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeGetSubmissionResponse(resp *http.Response) (res *Submission, _ error) {
switch resp.StatusCode {
case 200:
@@ -2883,6 +3420,105 @@ func decodeGetSubmissionResponse(resp *http.Response) (res *Submission, _ error)
return res, errors.Wrap(defRes, "error")
}
func decodeGetUserThumbnailResponse(resp *http.Response) (res *GetUserThumbnailFound, _ error) {
switch resp.StatusCode {
case 302:
// Code 302.
var wrapper GetUserThumbnailFound
h := uri.NewHeaderDecoder(resp.Header)
// Parse "Location" header.
{
cfg := uri.HeaderParameterDecodingConfig{
Name: "Location",
Explode: false,
}
if err := func() error {
if err := h.HasParam(cfg); err == nil {
if err := h.DecodeParam(cfg, func(d uri.Decoder) error {
var wrapperDotLocationVal string
if err := func() error {
val, err := d.DecodeValue()
if err != nil {
return err
}
c, err := conv.ToString(val)
if err != nil {
return err
}
wrapperDotLocationVal = c
return nil
}(); err != nil {
return err
}
wrapper.Location.SetTo(wrapperDotLocationVal)
return nil
}); err != nil {
return err
}
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "parse Location header")
}
}
return &wrapper, nil
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeListMapfixAuditEventsResponse(resp *http.Response) (res []AuditEvent, _ error) {
switch resp.StatusCode {
case 200:
@@ -3816,6 +4452,66 @@ func decodeReleaseSubmissionsResponse(resp *http.Response) (res *OperationID, _
return res, errors.Wrap(defRes, "error")
}
func decodeSeedCombobulatorResponse(resp *http.Response) (res *SeedCombobulatorNoContent, _ error) {
switch resp.StatusCode {
case 204:
// Code 204.
return &SeedCombobulatorNoContent{}, nil
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeSessionRolesResponse(resp *http.Response) (res *Roles, _ error) {
switch resp.StatusCode {
case 200:
@@ -4232,6 +4928,66 @@ func decodeSetSubmissionCompletedResponse(resp *http.Response) (res *SetSubmissi
return res, errors.Wrap(defRes, "error")
}
func decodeUpdateMapfixDescriptionResponse(resp *http.Response) (res *UpdateMapfixDescriptionNoContent, _ error) {
switch resp.StatusCode {
case 204:
// Code 204.
return &UpdateMapfixDescriptionNoContent{}, nil
}
// Convenient error response.
defRes, err := func() (res *ErrorStatusCode, err error) {
ct, _, err := mime.ParseMediaType(resp.Header.Get("Content-Type"))
if err != nil {
return res, errors.Wrap(err, "parse media type")
}
switch {
case ct == "application/json":
buf, err := io.ReadAll(resp.Body)
if err != nil {
return res, err
}
d := jx.DecodeBytes(buf)
var response Error
if err := func() error {
if err := response.Decode(d); err != nil {
return err
}
if err := d.Skip(); err != io.EOF {
return errors.New("unexpected trailing data")
}
return nil
}(); err != nil {
err = &ogenerrors.DecodeBodyError{
ContentType: ct,
Body: buf,
Err: err,
}
return res, err
}
// Validate response.
if err := func() error {
if err := response.Validate(); err != nil {
return err
}
return nil
}(); err != nil {
return res, errors.Wrap(err, "validate")
}
return &ErrorStatusCode{
StatusCode: resp.StatusCode,
Response: response,
}, nil
default:
return res, validate.InvalidContentType(ct)
}
}()
if err != nil {
return res, errors.Wrapf(err, "default (code %d)", resp.StatusCode)
}
return res, errors.Wrap(defRes, "error")
}
func decodeUpdateMapfixModelResponse(resp *http.Response) (res *UpdateMapfixModelNoContent, _ error) {
switch resp.StatusCode {
case 204:


@@ -8,10 +8,11 @@ import (
"github.com/go-faster/errors"
"github.com/go-faster/jx"
"github.com/ogen-go/ogen/conv"
ht "github.com/ogen-go/ogen/http"
"github.com/ogen-go/ogen/uri"
"go.opentelemetry.io/otel/codes"
"go.opentelemetry.io/otel/trace"
ht "github.com/ogen-go/ogen/http"
)
func encodeActionMapfixAcceptedResponse(response *ActionMapfixAcceptedNoContent, w http.ResponseWriter, span trace.Span) error {
@@ -182,6 +183,55 @@ func encodeActionSubmissionValidatedResponse(response *ActionSubmissionValidated
return nil
}
func encodeBatchAssetThumbnailsResponse(response *BatchAssetThumbnailsOK, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
span.SetStatus(codes.Ok, http.StatusText(200))
e := new(jx.Encoder)
response.Encode(e)
if _, err := e.WriteTo(w); err != nil {
return errors.Wrap(err, "write")
}
return nil
}
func encodeBatchUserThumbnailsResponse(response *BatchUserThumbnailsOK, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
span.SetStatus(codes.Ok, http.StatusText(200))
e := new(jx.Encoder)
response.Encode(e)
if _, err := e.WriteTo(w); err != nil {
return errors.Wrap(err, "write")
}
return nil
}
func encodeBatchUsernamesResponse(response *BatchUsernamesOK, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
span.SetStatus(codes.Ok, http.StatusText(200))
e := new(jx.Encoder)
response.Encode(e)
if _, err := e.WriteTo(w); err != nil {
return errors.Wrap(err, "write")
}
return nil
}
func encodeCombobulateMapResponse(response *CombobulateMapNoContent, w http.ResponseWriter, span trace.Span) error {
w.WriteHeader(204)
span.SetStatus(codes.Ok, http.StatusText(204))
return nil
}
func encodeCreateMapfixResponse(response *OperationID, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(201)
@@ -296,6 +346,33 @@ func encodeDownloadMapAssetResponse(response DownloadMapAssetOK, w http.Response
return nil
}
func encodeGetAssetThumbnailResponse(response *GetAssetThumbnailFound, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Access-Control-Expose-Headers", "Location")
// Encoding response headers.
{
h := uri.NewHeaderEncoder(w.Header())
// Encode "Location" header.
{
cfg := uri.HeaderParameterEncodingConfig{
Name: "Location",
Explode: false,
}
if err := h.EncodeParam(cfg, func(e uri.Encoder) error {
if val, ok := response.Location.Get(); ok {
return e.EncodeValue(conv.StringToString(val))
}
return nil
}); err != nil {
return errors.Wrap(err, "encode Location header")
}
}
}
w.WriteHeader(302)
span.SetStatus(codes.Ok, http.StatusText(302))
return nil
}
func encodeGetMapResponse(response *Map, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
@@ -366,6 +443,20 @@ func encodeGetScriptPolicyResponse(response *ScriptPolicy, w http.ResponseWriter
return nil
}
func encodeGetStatsResponse(response *Stats, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
span.SetStatus(codes.Ok, http.StatusText(200))
e := new(jx.Encoder)
response.Encode(e)
if _, err := e.WriteTo(w); err != nil {
return errors.Wrap(err, "write")
}
return nil
}
func encodeGetSubmissionResponse(response *Submission, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
@@ -380,6 +471,33 @@ func encodeGetSubmissionResponse(response *Submission, w http.ResponseWriter, sp
return nil
}
func encodeGetUserThumbnailResponse(response *GetUserThumbnailFound, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Access-Control-Expose-Headers", "Location")
// Encoding response headers.
{
h := uri.NewHeaderEncoder(w.Header())
// Encode "Location" header.
{
cfg := uri.HeaderParameterEncodingConfig{
Name: "Location",
Explode: false,
}
if err := h.EncodeParam(cfg, func(e uri.Encoder) error {
if val, ok := response.Location.Get(); ok {
return e.EncodeValue(conv.StringToString(val))
}
return nil
}); err != nil {
return errors.Wrap(err, "encode Location header")
}
}
}
w.WriteHeader(302)
span.SetStatus(codes.Ok, http.StatusText(302))
return nil
}
func encodeListMapfixAuditEventsResponse(response []AuditEvent, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
@@ -512,6 +630,13 @@ func encodeReleaseSubmissionsResponse(response *OperationID, w http.ResponseWrit
return nil
}
func encodeSeedCombobulatorResponse(response *SeedCombobulatorNoContent, w http.ResponseWriter, span trace.Span) error {
w.WriteHeader(204)
span.SetStatus(codes.Ok, http.StatusText(204))
return nil
}
func encodeSessionRolesResponse(response *Roles, w http.ResponseWriter, span trace.Span) error {
w.Header().Set("Content-Type", "application/json; charset=utf-8")
w.WriteHeader(200)
@@ -568,6 +693,13 @@ func encodeSetSubmissionCompletedResponse(response *SetSubmissionCompletedNoCont
return nil
}
func encodeUpdateMapfixDescriptionResponse(response *UpdateMapfixDescriptionNoContent, w http.ResponseWriter, span trace.Span) error {
w.WriteHeader(204)
span.SetStatus(codes.Ok, http.StatusText(204))
return nil
}
func encodeUpdateMapfixModelResponse(response *UpdateMapfixModelNoContent, w http.ResponseWriter, span trace.Span) error {
w.WriteHeader(204)
span.SetStatus(codes.Ok, http.StatusText(204))

(File diff suppressed because it is too large.)


@@ -7,6 +7,7 @@ import (
"io"
"time"
"github.com/go-faster/errors"
"github.com/go-faster/jx"
)
@@ -192,6 +193,257 @@ func (s *AuditEventEventData) init() AuditEventEventData {
return m
}
type BatchAssetThumbnailsOK struct {
// Map of asset ID to thumbnail URL.
Thumbnails OptBatchAssetThumbnailsOKThumbnails `json:"thumbnails"`
}
// GetThumbnails returns the value of Thumbnails.
func (s *BatchAssetThumbnailsOK) GetThumbnails() OptBatchAssetThumbnailsOKThumbnails {
return s.Thumbnails
}
// SetThumbnails sets the value of Thumbnails.
func (s *BatchAssetThumbnailsOK) SetThumbnails(val OptBatchAssetThumbnailsOKThumbnails) {
s.Thumbnails = val
}
// Map of asset ID to thumbnail URL.
type BatchAssetThumbnailsOKThumbnails map[string]string
func (s *BatchAssetThumbnailsOKThumbnails) init() BatchAssetThumbnailsOKThumbnails {
m := *s
if m == nil {
m = map[string]string{}
*s = m
}
return m
}
type BatchAssetThumbnailsReq struct {
// Array of asset IDs (max 100).
AssetIds []uint64 `json:"assetIds"`
// Thumbnail size.
Size OptBatchAssetThumbnailsReqSize `json:"size"`
}
// GetAssetIds returns the value of AssetIds.
func (s *BatchAssetThumbnailsReq) GetAssetIds() []uint64 {
return s.AssetIds
}
// GetSize returns the value of Size.
func (s *BatchAssetThumbnailsReq) GetSize() OptBatchAssetThumbnailsReqSize {
return s.Size
}
// SetAssetIds sets the value of AssetIds.
func (s *BatchAssetThumbnailsReq) SetAssetIds(val []uint64) {
s.AssetIds = val
}
// SetSize sets the value of Size.
func (s *BatchAssetThumbnailsReq) SetSize(val OptBatchAssetThumbnailsReqSize) {
s.Size = val
}
// Thumbnail size.
type BatchAssetThumbnailsReqSize string
const (
BatchAssetThumbnailsReqSize150x150 BatchAssetThumbnailsReqSize = "150x150"
BatchAssetThumbnailsReqSize420x420 BatchAssetThumbnailsReqSize = "420x420"
BatchAssetThumbnailsReqSize768x432 BatchAssetThumbnailsReqSize = "768x432"
)
// AllValues returns all BatchAssetThumbnailsReqSize values.
func (BatchAssetThumbnailsReqSize) AllValues() []BatchAssetThumbnailsReqSize {
return []BatchAssetThumbnailsReqSize{
BatchAssetThumbnailsReqSize150x150,
BatchAssetThumbnailsReqSize420x420,
BatchAssetThumbnailsReqSize768x432,
}
}
// MarshalText implements encoding.TextMarshaler.
func (s BatchAssetThumbnailsReqSize) MarshalText() ([]byte, error) {
switch s {
case BatchAssetThumbnailsReqSize150x150:
return []byte(s), nil
case BatchAssetThumbnailsReqSize420x420:
return []byte(s), nil
case BatchAssetThumbnailsReqSize768x432:
return []byte(s), nil
default:
return nil, errors.Errorf("invalid value: %q", s)
}
}
// UnmarshalText implements encoding.TextUnmarshaler.
func (s *BatchAssetThumbnailsReqSize) UnmarshalText(data []byte) error {
switch BatchAssetThumbnailsReqSize(data) {
case BatchAssetThumbnailsReqSize150x150:
*s = BatchAssetThumbnailsReqSize150x150
return nil
case BatchAssetThumbnailsReqSize420x420:
*s = BatchAssetThumbnailsReqSize420x420
return nil
case BatchAssetThumbnailsReqSize768x432:
*s = BatchAssetThumbnailsReqSize768x432
return nil
default:
return errors.Errorf("invalid value: %q", data)
}
}
type BatchUserThumbnailsOK struct {
// Map of user ID to thumbnail URL.
Thumbnails OptBatchUserThumbnailsOKThumbnails `json:"thumbnails"`
}
// GetThumbnails returns the value of Thumbnails.
func (s *BatchUserThumbnailsOK) GetThumbnails() OptBatchUserThumbnailsOKThumbnails {
return s.Thumbnails
}
// SetThumbnails sets the value of Thumbnails.
func (s *BatchUserThumbnailsOK) SetThumbnails(val OptBatchUserThumbnailsOKThumbnails) {
s.Thumbnails = val
}
// Map of user ID to thumbnail URL.
type BatchUserThumbnailsOKThumbnails map[string]string
func (s *BatchUserThumbnailsOKThumbnails) init() BatchUserThumbnailsOKThumbnails {
m := *s
if m == nil {
m = map[string]string{}
*s = m
}
return m
}
type BatchUserThumbnailsReq struct {
// Array of user IDs (max 100).
UserIds []uint64 `json:"userIds"`
// Thumbnail size.
Size OptBatchUserThumbnailsReqSize `json:"size"`
}
// GetUserIds returns the value of UserIds.
func (s *BatchUserThumbnailsReq) GetUserIds() []uint64 {
return s.UserIds
}
// GetSize returns the value of Size.
func (s *BatchUserThumbnailsReq) GetSize() OptBatchUserThumbnailsReqSize {
return s.Size
}
// SetUserIds sets the value of UserIds.
func (s *BatchUserThumbnailsReq) SetUserIds(val []uint64) {
s.UserIds = val
}
// SetSize sets the value of Size.
func (s *BatchUserThumbnailsReq) SetSize(val OptBatchUserThumbnailsReqSize) {
s.Size = val
}
// Thumbnail size.
type BatchUserThumbnailsReqSize string
const (
BatchUserThumbnailsReqSize150x150 BatchUserThumbnailsReqSize = "150x150"
BatchUserThumbnailsReqSize420x420 BatchUserThumbnailsReqSize = "420x420"
BatchUserThumbnailsReqSize768x432 BatchUserThumbnailsReqSize = "768x432"
)
// AllValues returns all BatchUserThumbnailsReqSize values.
func (BatchUserThumbnailsReqSize) AllValues() []BatchUserThumbnailsReqSize {
return []BatchUserThumbnailsReqSize{
BatchUserThumbnailsReqSize150x150,
BatchUserThumbnailsReqSize420x420,
BatchUserThumbnailsReqSize768x432,
}
}
// MarshalText implements encoding.TextMarshaler.
func (s BatchUserThumbnailsReqSize) MarshalText() ([]byte, error) {
switch s {
case BatchUserThumbnailsReqSize150x150:
return []byte(s), nil
case BatchUserThumbnailsReqSize420x420:
return []byte(s), nil
case BatchUserThumbnailsReqSize768x432:
return []byte(s), nil
default:
return nil, errors.Errorf("invalid value: %q", s)
}
}
// UnmarshalText implements encoding.TextUnmarshaler.
func (s *BatchUserThumbnailsReqSize) UnmarshalText(data []byte) error {
switch BatchUserThumbnailsReqSize(data) {
case BatchUserThumbnailsReqSize150x150:
*s = BatchUserThumbnailsReqSize150x150
return nil
case BatchUserThumbnailsReqSize420x420:
*s = BatchUserThumbnailsReqSize420x420
return nil
case BatchUserThumbnailsReqSize768x432:
*s = BatchUserThumbnailsReqSize768x432
return nil
default:
return errors.Errorf("invalid value: %q", data)
}
}
type BatchUsernamesOK struct {
// Map of user ID to username.
Usernames OptBatchUsernamesOKUsernames `json:"usernames"`
}
// GetUsernames returns the value of Usernames.
func (s *BatchUsernamesOK) GetUsernames() OptBatchUsernamesOKUsernames {
return s.Usernames
}
// SetUsernames sets the value of Usernames.
func (s *BatchUsernamesOK) SetUsernames(val OptBatchUsernamesOKUsernames) {
s.Usernames = val
}
// Map of user ID to username.
type BatchUsernamesOKUsernames map[string]string
func (s *BatchUsernamesOKUsernames) init() BatchUsernamesOKUsernames {
m := *s
if m == nil {
m = map[string]string{}
*s = m
}
return m
}
type BatchUsernamesReq struct {
// Array of user IDs (max 100).
UserIds []uint64 `json:"userIds"`
}
// GetUserIds returns the value of UserIds.
func (s *BatchUsernamesReq) GetUserIds() []uint64 {
return s.UserIds
}
// SetUserIds sets the value of UserIds.
func (s *BatchUsernamesReq) SetUserIds(val []uint64) {
s.UserIds = val
}
// CombobulateMapNoContent is response for CombobulateMap operation.
type CombobulateMapNoContent struct{}
type CookieAuth struct {
APIKey string
Roles []string
@@ -324,6 +576,132 @@ func (s *ErrorStatusCode) SetResponse(val Error) {
s.Response = val
}
// GetAssetThumbnailFound is response for GetAssetThumbnail operation.
type GetAssetThumbnailFound struct {
Location OptString
}
// GetLocation returns the value of Location.
func (s *GetAssetThumbnailFound) GetLocation() OptString {
return s.Location
}
// SetLocation sets the value of Location.
func (s *GetAssetThumbnailFound) SetLocation(val OptString) {
s.Location = val
}
type GetAssetThumbnailSize string
const (
GetAssetThumbnailSize150x150 GetAssetThumbnailSize = "150x150"
GetAssetThumbnailSize420x420 GetAssetThumbnailSize = "420x420"
GetAssetThumbnailSize768x432 GetAssetThumbnailSize = "768x432"
)
// AllValues returns all GetAssetThumbnailSize values.
func (GetAssetThumbnailSize) AllValues() []GetAssetThumbnailSize {
return []GetAssetThumbnailSize{
GetAssetThumbnailSize150x150,
GetAssetThumbnailSize420x420,
GetAssetThumbnailSize768x432,
}
}
// MarshalText implements encoding.TextMarshaler.
func (s GetAssetThumbnailSize) MarshalText() ([]byte, error) {
switch s {
case GetAssetThumbnailSize150x150:
return []byte(s), nil
case GetAssetThumbnailSize420x420:
return []byte(s), nil
case GetAssetThumbnailSize768x432:
return []byte(s), nil
default:
return nil, errors.Errorf("invalid value: %q", s)
}
}
// UnmarshalText implements encoding.TextUnmarshaler.
func (s *GetAssetThumbnailSize) UnmarshalText(data []byte) error {
switch GetAssetThumbnailSize(data) {
case GetAssetThumbnailSize150x150:
*s = GetAssetThumbnailSize150x150
return nil
case GetAssetThumbnailSize420x420:
*s = GetAssetThumbnailSize420x420
return nil
case GetAssetThumbnailSize768x432:
*s = GetAssetThumbnailSize768x432
return nil
default:
return errors.Errorf("invalid value: %q", data)
}
}
// GetUserThumbnailFound is response for GetUserThumbnail operation.
type GetUserThumbnailFound struct {
Location OptString
}
// GetLocation returns the value of Location.
func (s *GetUserThumbnailFound) GetLocation() OptString {
return s.Location
}
// SetLocation sets the value of Location.
func (s *GetUserThumbnailFound) SetLocation(val OptString) {
s.Location = val
}
type GetUserThumbnailSize string
const (
GetUserThumbnailSize150x150 GetUserThumbnailSize = "150x150"
GetUserThumbnailSize420x420 GetUserThumbnailSize = "420x420"
GetUserThumbnailSize768x432 GetUserThumbnailSize = "768x432"
)
// AllValues returns all GetUserThumbnailSize values.
func (GetUserThumbnailSize) AllValues() []GetUserThumbnailSize {
return []GetUserThumbnailSize{
GetUserThumbnailSize150x150,
GetUserThumbnailSize420x420,
GetUserThumbnailSize768x432,
}
}
// MarshalText implements encoding.TextMarshaler.
func (s GetUserThumbnailSize) MarshalText() ([]byte, error) {
switch s {
case GetUserThumbnailSize150x150:
return []byte(s), nil
case GetUserThumbnailSize420x420:
return []byte(s), nil
case GetUserThumbnailSize768x432:
return []byte(s), nil
default:
return nil, errors.Errorf("invalid value: %q", s)
}
}
// UnmarshalText implements encoding.TextUnmarshaler.
func (s *GetUserThumbnailSize) UnmarshalText(data []byte) error {
switch GetUserThumbnailSize(data) {
case GetUserThumbnailSize150x150:
*s = GetUserThumbnailSize150x150
return nil
case GetUserThumbnailSize420x420:
*s = GetUserThumbnailSize420x420
return nil
case GetUserThumbnailSize768x432:
*s = GetUserThumbnailSize768x432
return nil
default:
return errors.Errorf("invalid value: %q", data)
}
}
// Ref: #/components/schemas/Map
type Map struct {
ID int64 `json:"ID"`
@@ -777,6 +1155,328 @@ func (s *OperationID) SetOperationID(val int32) {
s.OperationID = val
}
// NewOptBatchAssetThumbnailsOKThumbnails returns new OptBatchAssetThumbnailsOKThumbnails with value set to v.
func NewOptBatchAssetThumbnailsOKThumbnails(v BatchAssetThumbnailsOKThumbnails) OptBatchAssetThumbnailsOKThumbnails {
return OptBatchAssetThumbnailsOKThumbnails{
Value: v,
Set: true,
}
}
// OptBatchAssetThumbnailsOKThumbnails is optional BatchAssetThumbnailsOKThumbnails.
type OptBatchAssetThumbnailsOKThumbnails struct {
Value BatchAssetThumbnailsOKThumbnails
Set bool
}
// IsSet returns true if OptBatchAssetThumbnailsOKThumbnails was set.
func (o OptBatchAssetThumbnailsOKThumbnails) IsSet() bool { return o.Set }
// Reset unsets value.
func (o *OptBatchAssetThumbnailsOKThumbnails) Reset() {
var v BatchAssetThumbnailsOKThumbnails
o.Value = v
o.Set = false
}
// SetTo sets value to v.
func (o *OptBatchAssetThumbnailsOKThumbnails) SetTo(v BatchAssetThumbnailsOKThumbnails) {
o.Set = true
o.Value = v
}
// Get returns value and boolean that denotes whether value was set.
func (o OptBatchAssetThumbnailsOKThumbnails) Get() (v BatchAssetThumbnailsOKThumbnails, ok bool) {
if !o.Set {
return v, false
}
return o.Value, true
}
// Or returns the value if set, or the given parameter if not.
func (o OptBatchAssetThumbnailsOKThumbnails) Or(d BatchAssetThumbnailsOKThumbnails) BatchAssetThumbnailsOKThumbnails {
if v, ok := o.Get(); ok {
return v
}
return d
}
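Every generated Opt* wrapper repeats the same shape; ogen emits one per type, but the semantics are those of a single generic optional. A sketch of that shape (not part of the generated API):

```go
package main

import "fmt"

// Opt is a generic rendering of the generated Opt* pattern:
// a value plus a Set flag that distinguishes "zero" from "absent".
type Opt[T any] struct {
	Value T
	Set   bool
}

// NewOpt returns an Opt with the value set, like the generated NewOpt* constructors.
func NewOpt[T any](v T) Opt[T] { return Opt[T]{Value: v, Set: true} }

// Get returns the value and whether it was set.
func (o Opt[T]) Get() (T, bool) {
	if !o.Set {
		var zero T
		return zero, false
	}
	return o.Value, true
}

// Or returns the value if set, otherwise the given default.
func (o Opt[T]) Or(d T) T {
	if v, ok := o.Get(); ok {
		return v
	}
	return d
}

func main() {
	size := NewOpt("420x420")
	fmt.Println(size.Or("150x150")) // set: the stored value wins

	var missing Opt[string]
	fmt.Println(missing.Or("150x150")) // unset: falls back to the default
}
```

The Set flag is what lets the API distinguish "client omitted the field" from "client sent the zero value", which a bare `string` or `int64` cannot express.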
// NewOptBatchAssetThumbnailsReqSize returns new OptBatchAssetThumbnailsReqSize with value set to v.
func NewOptBatchAssetThumbnailsReqSize(v BatchAssetThumbnailsReqSize) OptBatchAssetThumbnailsReqSize {
return OptBatchAssetThumbnailsReqSize{
Value: v,
Set: true,
}
}
// OptBatchAssetThumbnailsReqSize is optional BatchAssetThumbnailsReqSize.
type OptBatchAssetThumbnailsReqSize struct {
Value BatchAssetThumbnailsReqSize
Set bool
}
// IsSet returns true if OptBatchAssetThumbnailsReqSize was set.
func (o OptBatchAssetThumbnailsReqSize) IsSet() bool { return o.Set }
// Reset unsets value.
func (o *OptBatchAssetThumbnailsReqSize) Reset() {
var v BatchAssetThumbnailsReqSize
o.Value = v
o.Set = false
}
// SetTo sets value to v.
func (o *OptBatchAssetThumbnailsReqSize) SetTo(v BatchAssetThumbnailsReqSize) {
o.Set = true
o.Value = v
}
// Get returns value and boolean that denotes whether value was set.
func (o OptBatchAssetThumbnailsReqSize) Get() (v BatchAssetThumbnailsReqSize, ok bool) {
if !o.Set {
return v, false
}
return o.Value, true
}
// Or returns the value if set, or the given parameter if not.
func (o OptBatchAssetThumbnailsReqSize) Or(d BatchAssetThumbnailsReqSize) BatchAssetThumbnailsReqSize {
if v, ok := o.Get(); ok {
return v
}
return d
}
// NewOptBatchUserThumbnailsOKThumbnails returns new OptBatchUserThumbnailsOKThumbnails with value set to v.
func NewOptBatchUserThumbnailsOKThumbnails(v BatchUserThumbnailsOKThumbnails) OptBatchUserThumbnailsOKThumbnails {
return OptBatchUserThumbnailsOKThumbnails{
Value: v,
Set: true,
}
}
// OptBatchUserThumbnailsOKThumbnails is optional BatchUserThumbnailsOKThumbnails.
type OptBatchUserThumbnailsOKThumbnails struct {
Value BatchUserThumbnailsOKThumbnails
Set bool
}
// IsSet returns true if OptBatchUserThumbnailsOKThumbnails was set.
func (o OptBatchUserThumbnailsOKThumbnails) IsSet() bool { return o.Set }
// Reset unsets value.
func (o *OptBatchUserThumbnailsOKThumbnails) Reset() {
var v BatchUserThumbnailsOKThumbnails
o.Value = v
o.Set = false
}
// SetTo sets value to v.
func (o *OptBatchUserThumbnailsOKThumbnails) SetTo(v BatchUserThumbnailsOKThumbnails) {
o.Set = true
o.Value = v
}
// Get returns value and boolean that denotes whether value was set.
func (o OptBatchUserThumbnailsOKThumbnails) Get() (v BatchUserThumbnailsOKThumbnails, ok bool) {
if !o.Set {
return v, false
}
return o.Value, true
}
// Or returns the value if set, or the given parameter if not.
func (o OptBatchUserThumbnailsOKThumbnails) Or(d BatchUserThumbnailsOKThumbnails) BatchUserThumbnailsOKThumbnails {
if v, ok := o.Get(); ok {
return v
}
return d
}
// NewOptBatchUserThumbnailsReqSize returns new OptBatchUserThumbnailsReqSize with value set to v.
func NewOptBatchUserThumbnailsReqSize(v BatchUserThumbnailsReqSize) OptBatchUserThumbnailsReqSize {
return OptBatchUserThumbnailsReqSize{
Value: v,
Set: true,
}
}
// OptBatchUserThumbnailsReqSize is optional BatchUserThumbnailsReqSize.
type OptBatchUserThumbnailsReqSize struct {
Value BatchUserThumbnailsReqSize
Set bool
}
// IsSet returns true if OptBatchUserThumbnailsReqSize was set.
func (o OptBatchUserThumbnailsReqSize) IsSet() bool { return o.Set }
// Reset unsets value.
func (o *OptBatchUserThumbnailsReqSize) Reset() {
var v BatchUserThumbnailsReqSize
o.Value = v
o.Set = false
}
// SetTo sets value to v.
func (o *OptBatchUserThumbnailsReqSize) SetTo(v BatchUserThumbnailsReqSize) {
o.Set = true
o.Value = v
}
// Get returns value and boolean that denotes whether value was set.
func (o OptBatchUserThumbnailsReqSize) Get() (v BatchUserThumbnailsReqSize, ok bool) {
if !o.Set {
return v, false
}
return o.Value, true
}
// Or returns the value if set, or the given parameter if not.
func (o OptBatchUserThumbnailsReqSize) Or(d BatchUserThumbnailsReqSize) BatchUserThumbnailsReqSize {
if v, ok := o.Get(); ok {
return v
}
return d
}
// NewOptBatchUsernamesOKUsernames returns new OptBatchUsernamesOKUsernames with value set to v.
func NewOptBatchUsernamesOKUsernames(v BatchUsernamesOKUsernames) OptBatchUsernamesOKUsernames {
return OptBatchUsernamesOKUsernames{
Value: v,
Set: true,
}
}
// OptBatchUsernamesOKUsernames is optional BatchUsernamesOKUsernames.
type OptBatchUsernamesOKUsernames struct {
Value BatchUsernamesOKUsernames
Set bool
}
// IsSet returns true if OptBatchUsernamesOKUsernames was set.
func (o OptBatchUsernamesOKUsernames) IsSet() bool { return o.Set }
// Reset unsets value.
func (o *OptBatchUsernamesOKUsernames) Reset() {
var v BatchUsernamesOKUsernames
o.Value = v
o.Set = false
}
// SetTo sets value to v.
func (o *OptBatchUsernamesOKUsernames) SetTo(v BatchUsernamesOKUsernames) {
o.Set = true
o.Value = v
}
// Get returns value and boolean that denotes whether value was set.
func (o OptBatchUsernamesOKUsernames) Get() (v BatchUsernamesOKUsernames, ok bool) {
if !o.Set {
return v, false
}
return o.Value, true
}
// Or returns the value if set, or the given parameter if not.
func (o OptBatchUsernamesOKUsernames) Or(d BatchUsernamesOKUsernames) BatchUsernamesOKUsernames {
if v, ok := o.Get(); ok {
return v
}
return d
}
// NewOptGetAssetThumbnailSize returns new OptGetAssetThumbnailSize with value set to v.
func NewOptGetAssetThumbnailSize(v GetAssetThumbnailSize) OptGetAssetThumbnailSize {
return OptGetAssetThumbnailSize{
Value: v,
Set: true,
}
}
// OptGetAssetThumbnailSize is optional GetAssetThumbnailSize.
type OptGetAssetThumbnailSize struct {
Value GetAssetThumbnailSize
Set bool
}
// IsSet returns true if OptGetAssetThumbnailSize was set.
func (o OptGetAssetThumbnailSize) IsSet() bool { return o.Set }
// Reset unsets value.
func (o *OptGetAssetThumbnailSize) Reset() {
var v GetAssetThumbnailSize
o.Value = v
o.Set = false
}
// SetTo sets value to v.
func (o *OptGetAssetThumbnailSize) SetTo(v GetAssetThumbnailSize) {
o.Set = true
o.Value = v
}
// Get returns value and boolean that denotes whether value was set.
func (o OptGetAssetThumbnailSize) Get() (v GetAssetThumbnailSize, ok bool) {
if !o.Set {
return v, false
}
return o.Value, true
}
// Or returns the value if set, or the given parameter if not.
func (o OptGetAssetThumbnailSize) Or(d GetAssetThumbnailSize) GetAssetThumbnailSize {
if v, ok := o.Get(); ok {
return v
}
return d
}
// NewOptGetUserThumbnailSize returns new OptGetUserThumbnailSize with value set to v.
func NewOptGetUserThumbnailSize(v GetUserThumbnailSize) OptGetUserThumbnailSize {
return OptGetUserThumbnailSize{
Value: v,
Set: true,
}
}
// OptGetUserThumbnailSize is optional GetUserThumbnailSize.
type OptGetUserThumbnailSize struct {
Value GetUserThumbnailSize
Set bool
}
// IsSet returns true if OptGetUserThumbnailSize was set.
func (o OptGetUserThumbnailSize) IsSet() bool { return o.Set }
// Reset unsets value.
func (o *OptGetUserThumbnailSize) Reset() {
var v GetUserThumbnailSize
o.Value = v
o.Set = false
}
// SetTo sets value to v.
func (o *OptGetUserThumbnailSize) SetTo(v GetUserThumbnailSize) {
o.Set = true
o.Value = v
}
// Get returns value and boolean that denotes whether value was set.
func (o OptGetUserThumbnailSize) Get() (v GetUserThumbnailSize, ok bool) {
if !o.Set {
return v, false
}
return o.Value, true
}
// Or returns the value if set, or the given parameter if not.
func (o OptGetUserThumbnailSize) Or(d GetUserThumbnailSize) GetUserThumbnailSize {
if v, ok := o.Get(); ok {
return v
}
return d
}
// NewOptInt32 returns new OptInt32 with value set to v.
func NewOptInt32(v int32) OptInt32 {
return OptInt32{
@@ -1296,12 +1996,92 @@ func (s *ScriptUpdate) SetResourceID(val OptInt64) {
s.ResourceID = val
}
// SeedCombobulatorNoContent is response for SeedCombobulator operation.
type SeedCombobulatorNoContent struct{}
// SetMapfixCompletedNoContent is response for SetMapfixCompleted operation.
type SetMapfixCompletedNoContent struct{}
// SetSubmissionCompletedNoContent is response for SetSubmissionCompleted operation.
type SetSubmissionCompletedNoContent struct{}
// Aggregate statistics for submissions and mapfixes.
// Ref: #/components/schemas/Stats
type Stats struct {
// Total number of submissions.
TotalSubmissions int64 `json:"TotalSubmissions"`
// Total number of mapfixes.
TotalMapfixes int64 `json:"TotalMapfixes"`
// Number of released submissions.
ReleasedSubmissions int64 `json:"ReleasedSubmissions"`
// Number of released mapfixes.
ReleasedMapfixes int64 `json:"ReleasedMapfixes"`
// Number of submissions under review.
SubmittedSubmissions int64 `json:"SubmittedSubmissions"`
// Number of mapfixes under review.
SubmittedMapfixes int64 `json:"SubmittedMapfixes"`
}
// GetTotalSubmissions returns the value of TotalSubmissions.
func (s *Stats) GetTotalSubmissions() int64 {
return s.TotalSubmissions
}
// GetTotalMapfixes returns the value of TotalMapfixes.
func (s *Stats) GetTotalMapfixes() int64 {
return s.TotalMapfixes
}
// GetReleasedSubmissions returns the value of ReleasedSubmissions.
func (s *Stats) GetReleasedSubmissions() int64 {
return s.ReleasedSubmissions
}
// GetReleasedMapfixes returns the value of ReleasedMapfixes.
func (s *Stats) GetReleasedMapfixes() int64 {
return s.ReleasedMapfixes
}
// GetSubmittedSubmissions returns the value of SubmittedSubmissions.
func (s *Stats) GetSubmittedSubmissions() int64 {
return s.SubmittedSubmissions
}
// GetSubmittedMapfixes returns the value of SubmittedMapfixes.
func (s *Stats) GetSubmittedMapfixes() int64 {
return s.SubmittedMapfixes
}
// SetTotalSubmissions sets the value of TotalSubmissions.
func (s *Stats) SetTotalSubmissions(val int64) {
s.TotalSubmissions = val
}
// SetTotalMapfixes sets the value of TotalMapfixes.
func (s *Stats) SetTotalMapfixes(val int64) {
s.TotalMapfixes = val
}
// SetReleasedSubmissions sets the value of ReleasedSubmissions.
func (s *Stats) SetReleasedSubmissions(val int64) {
s.ReleasedSubmissions = val
}
// SetReleasedMapfixes sets the value of ReleasedMapfixes.
func (s *Stats) SetReleasedMapfixes(val int64) {
s.ReleasedMapfixes = val
}
// SetSubmittedSubmissions sets the value of SubmittedSubmissions.
func (s *Stats) SetSubmittedSubmissions(val int64) {
s.SubmittedSubmissions = val
}
// SetSubmittedMapfixes sets the value of SubmittedMapfixes.
func (s *Stats) SetSubmittedMapfixes(val int64) {
s.SubmittedMapfixes = val
}
// Ref: #/components/schemas/Submission
type Submission struct {
ID int64 `json:"ID"`
@@ -1534,6 +2314,23 @@ func (s *Submissions) SetSubmissions(val []Submission) {
s.Submissions = val
}
// UpdateMapfixDescriptionNoContent is response for UpdateMapfixDescription operation.
type UpdateMapfixDescriptionNoContent struct{}
type UpdateMapfixDescriptionReq struct {
Data io.Reader
}
// Read reads data from the Data reader.
//
// Kept to satisfy the io.Reader interface.
func (s UpdateMapfixDescriptionReq) Read(p []byte) (n int, err error) {
if s.Data == nil {
return 0, io.EOF
}
return s.Data.Read(p)
}
// UpdateMapfixModelNoContent is response for UpdateMapfixModel operation.
type UpdateMapfixModelNoContent struct{}


@@ -8,7 +8,6 @@ import (
"strings"
"github.com/go-faster/errors"
"github.com/ogen-go/ogen/ogenerrors"
)
@@ -33,6 +32,7 @@ func findAuthorization(h http.Header, prefix string) (string, bool) {
return "", false
}
// operationRolesCookieAuth is a private map storing roles per operation.
var operationRolesCookieAuth = map[string][]string{
ActionMapfixAcceptedOperation: []string{},
ActionMapfixRejectOperation: []string{},
@@ -58,6 +58,7 @@ var operationRolesCookieAuth = map[string][]string{
ActionSubmissionTriggerUploadOperation: []string{},
ActionSubmissionTriggerValidateOperation: []string{},
ActionSubmissionValidatedOperation: []string{},
CombobulateMapOperation: []string{},
CreateMapfixOperation: []string{},
CreateMapfixAuditCommentOperation: []string{},
CreateScriptOperation: []string{},
@@ -70,17 +71,40 @@ var operationRolesCookieAuth = map[string][]string{
DownloadMapAssetOperation: []string{},
GetOperationOperation: []string{},
ReleaseSubmissionsOperation: []string{},
SeedCombobulatorOperation: []string{},
SessionRolesOperation: []string{},
SessionUserOperation: []string{},
SessionValidateOperation: []string{},
SetMapfixCompletedOperation: []string{},
SetSubmissionCompletedOperation: []string{},
UpdateMapfixDescriptionOperation: []string{},
UpdateMapfixModelOperation: []string{},
UpdateScriptOperation: []string{},
UpdateScriptPolicyOperation: []string{},
UpdateSubmissionModelOperation: []string{},
}
// GetRolesForCookieAuth returns the required roles for the given operation.
//
// This is useful for authorization scenarios where you need to know which roles
// are required for an operation.
//
// Example:
//
// requiredRoles := GetRolesForCookieAuth(AddPetOperation)
//
// Returns nil if the operation has no role requirements or if the operation is unknown.
func GetRolesForCookieAuth(operation string) []string {
roles, ok := operationRolesCookieAuth[operation]
if !ok {
return nil
}
// Return a copy to prevent external modification
result := make([]string, len(roles))
copy(result, roles)
return result
}
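GetRolesForCookieAuth only reports the required roles; enforcement is left to the caller. One way a handler might intersect them with the session's roles (the `hasAnyRole` helper is a hypothetical sketch, not generated code):

```go
package main

import "fmt"

// hasAnyRole reports whether the session holds at least one required role.
// An empty requirement list means the operation is open to any authenticated session.
func hasAnyRole(sessionRoles, required []string) bool {
	if len(required) == 0 {
		return true
	}
	have := make(map[string]struct{}, len(sessionRoles))
	for _, r := range sessionRoles {
		have[r] = struct{}{}
	}
	for _, r := range required {
		if _, ok := have[r]; ok {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(hasAnyRole([]string{"MapReviewer"}, nil))                  // no requirement: allowed
	fmt.Println(hasAnyRole([]string{"MapReviewer"}, []string{"MapAdmin"})) // missing role: denied
}
```

This matches the map above, where most operations currently carry an empty role list, so role checks become a no-op until requirements are filled in.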
func (s *Server) securityCookieAuth(ctx context.Context, operationName OperationName, req *http.Request) (context.Context, bool, error) {
var t CookieAuth
const parameterName = "session_id"


@@ -155,6 +155,30 @@ type Handler interface {
//
// POST /submissions/{SubmissionID}/status/reset-uploading
ActionSubmissionValidated(ctx context.Context, params ActionSubmissionValidatedParams) error
// BatchAssetThumbnails implements batchAssetThumbnails operation.
//
// Batch fetch asset thumbnails.
//
// POST /thumbnails/assets
BatchAssetThumbnails(ctx context.Context, req *BatchAssetThumbnailsReq) (*BatchAssetThumbnailsOK, error)
// BatchUserThumbnails implements batchUserThumbnails operation.
//
// Batch fetch user avatar thumbnails.
//
// POST /thumbnails/users
BatchUserThumbnails(ctx context.Context, req *BatchUserThumbnailsReq) (*BatchUserThumbnailsOK, error)
// BatchUsernames implements batchUsernames operation.
//
// Batch fetch usernames.
//
// POST /usernames
BatchUsernames(ctx context.Context, req *BatchUsernamesReq) (*BatchUsernamesOK, error)
// CombobulateMap implements combobulateMap operation.
//
// Queue a map for combobulator processing.
//
// POST /maps/{MapID}/combobulate
CombobulateMap(ctx context.Context, params CombobulateMapParams) error
// CreateMapfix implements createMapfix operation.
//
// Trigger the validator to create a mapfix.
@@ -215,6 +239,12 @@ type Handler interface {
//
// GET /maps/{MapID}/download
DownloadMapAsset(ctx context.Context, params DownloadMapAssetParams) (DownloadMapAssetOK, error)
// GetAssetThumbnail implements getAssetThumbnail operation.
//
// Get single asset thumbnail.
//
// GET /thumbnails/asset/{AssetID}
GetAssetThumbnail(ctx context.Context, params GetAssetThumbnailParams) (*GetAssetThumbnailFound, error)
// GetMap implements getMap operation.
//
// Retrieve map with ID.
@@ -245,12 +275,24 @@ type Handler interface {
//
// GET /script-policy/{ScriptPolicyID}
GetScriptPolicy(ctx context.Context, params GetScriptPolicyParams) (*ScriptPolicy, error)
// GetStats implements getStats operation.
//
// Get aggregate statistics.
//
// GET /stats
GetStats(ctx context.Context) (*Stats, error)
// GetSubmission implements getSubmission operation.
//
// Retrieve submission with ID.
//
// GET /submissions/{SubmissionID}
GetSubmission(ctx context.Context, params GetSubmissionParams) (*Submission, error)
// GetUserThumbnail implements getUserThumbnail operation.
//
// Get single user avatar thumbnail.
//
// GET /thumbnails/user/{UserID}
GetUserThumbnail(ctx context.Context, params GetUserThumbnailParams) (*GetUserThumbnailFound, error)
// ListMapfixAuditEvents implements listMapfixAuditEvents operation.
//
// Retrieve a list of audit events.
@@ -299,6 +341,12 @@ type Handler interface {
//
// POST /release-submissions
ReleaseSubmissions(ctx context.Context, req []ReleaseInfo) (*OperationID, error)
// SeedCombobulator implements seedCombobulator operation.
//
// Queue all maps for combobulator processing.
//
// POST /maps-admin/seed-combobulator
SeedCombobulator(ctx context.Context) error
// SessionRoles implements sessionRoles operation.
//
// Get list of roles for the current session.
@@ -329,6 +377,12 @@ type Handler interface {
//
// POST /submissions/{SubmissionID}/completed
SetSubmissionCompleted(ctx context.Context, params SetSubmissionCompletedParams) error
// UpdateMapfixDescription implements updateMapfixDescription operation.
//
// Update description (submitter only).
//
// PATCH /mapfixes/{MapfixID}/description
UpdateMapfixDescription(ctx context.Context, req UpdateMapfixDescriptionReq, params UpdateMapfixDescriptionParams) error
// UpdateMapfixModel implements updateMapfixModel operation.
//
// Update model following role restrictions.


@@ -232,6 +232,42 @@ func (UnimplementedHandler) ActionSubmissionValidated(ctx context.Context, param
return ht.ErrNotImplemented
}
// BatchAssetThumbnails implements batchAssetThumbnails operation.
//
// Batch fetch asset thumbnails.
//
// POST /thumbnails/assets
func (UnimplementedHandler) BatchAssetThumbnails(ctx context.Context, req *BatchAssetThumbnailsReq) (r *BatchAssetThumbnailsOK, _ error) {
return r, ht.ErrNotImplemented
}
// BatchUserThumbnails implements batchUserThumbnails operation.
//
// Batch fetch user avatar thumbnails.
//
// POST /thumbnails/users
func (UnimplementedHandler) BatchUserThumbnails(ctx context.Context, req *BatchUserThumbnailsReq) (r *BatchUserThumbnailsOK, _ error) {
return r, ht.ErrNotImplemented
}
// BatchUsernames implements batchUsernames operation.
//
// Batch fetch usernames.
//
// POST /usernames
func (UnimplementedHandler) BatchUsernames(ctx context.Context, req *BatchUsernamesReq) (r *BatchUsernamesOK, _ error) {
return r, ht.ErrNotImplemented
}
// CombobulateMap implements combobulateMap operation.
//
// Queue a map for combobulator processing.
//
// POST /maps/{MapID}/combobulate
func (UnimplementedHandler) CombobulateMap(ctx context.Context, params CombobulateMapParams) error {
return ht.ErrNotImplemented
}
// CreateMapfix implements createMapfix operation.
//
// Trigger the validator to create a mapfix.
@@ -322,6 +358,15 @@ func (UnimplementedHandler) DownloadMapAsset(ctx context.Context, params Downloa
return r, ht.ErrNotImplemented
}
// GetAssetThumbnail implements getAssetThumbnail operation.
//
// Get single asset thumbnail.
//
// GET /thumbnails/asset/{AssetID}
func (UnimplementedHandler) GetAssetThumbnail(ctx context.Context, params GetAssetThumbnailParams) (r *GetAssetThumbnailFound, _ error) {
return r, ht.ErrNotImplemented
}
// GetMap implements getMap operation.
//
// Retrieve map with ID.
@@ -367,6 +412,15 @@ func (UnimplementedHandler) GetScriptPolicy(ctx context.Context, params GetScrip
return r, ht.ErrNotImplemented
}
// GetStats implements getStats operation.
//
// Get aggregate statistics.
//
// GET /stats
func (UnimplementedHandler) GetStats(ctx context.Context) (r *Stats, _ error) {
return r, ht.ErrNotImplemented
}
// GetSubmission implements getSubmission operation.
//
// Retrieve submission with ID.
@@ -376,6 +430,15 @@ func (UnimplementedHandler) GetSubmission(ctx context.Context, params GetSubmiss
return r, ht.ErrNotImplemented
}
// GetUserThumbnail implements getUserThumbnail operation.
//
// Get single user avatar thumbnail.
//
// GET /thumbnails/user/{UserID}
func (UnimplementedHandler) GetUserThumbnail(ctx context.Context, params GetUserThumbnailParams) (r *GetUserThumbnailFound, _ error) {
return r, ht.ErrNotImplemented
}
// ListMapfixAuditEvents implements listMapfixAuditEvents operation.
//
// Retrieve a list of audit events.
@@ -448,6 +511,15 @@ func (UnimplementedHandler) ReleaseSubmissions(ctx context.Context, req []Releas
return r, ht.ErrNotImplemented
}
// SeedCombobulator implements seedCombobulator operation.
//
// Queue all maps for combobulator processing.
//
// POST /maps-admin/seed-combobulator
func (UnimplementedHandler) SeedCombobulator(ctx context.Context) error {
return ht.ErrNotImplemented
}
// SessionRoles implements sessionRoles operation.
//
// Get list of roles for the current session.
@@ -493,6 +565,15 @@ func (UnimplementedHandler) SetSubmissionCompleted(ctx context.Context, params S
return ht.ErrNotImplemented
}
// UpdateMapfixDescription implements updateMapfixDescription operation.
//
// Update description (submitter only).
//
// PATCH /mapfixes/{MapfixID}/description
func (UnimplementedHandler) UpdateMapfixDescription(ctx context.Context, req UpdateMapfixDescriptionReq, params UpdateMapfixDescriptionParams) error {
return ht.ErrNotImplemented
}
// UpdateMapfixModel implements updateMapfixModel operation.
//
// Update model following role restrictions.

File diff suppressed because it is too large


@@ -8,6 +8,8 @@ import (
"git.itzana.me/strafesnet/go-grpc/auth"
"git.itzana.me/strafesnet/go-grpc/maps"
"git.itzana.me/strafesnet/go-grpc/maps_extended"
"git.itzana.me/strafesnet/go-grpc/mapfixes"
"git.itzana.me/strafesnet/go-grpc/submissions"
"git.itzana.me/strafesnet/go-grpc/users"
"git.itzana.me/strafesnet/go-grpc/validator"
"git.itzana.me/strafesnet/maps-service/pkg/api"
@@ -17,7 +19,10 @@ import (
"git.itzana.me/strafesnet/maps-service/pkg/service"
"git.itzana.me/strafesnet/maps-service/pkg/validator_controller"
"git.itzana.me/strafesnet/maps-service/pkg/web_api"
awsconfig "github.com/aws/aws-sdk-go-v2/config"
"github.com/aws/aws-sdk-go-v2/service/s3"
"github.com/nats-io/nats.go"
"github.com/redis/go-redis/v9"
log "github.com/sirupsen/logrus"
"github.com/urfave/cli/v2"
"google.golang.org/grpc"
@@ -102,6 +107,30 @@ func NewServeCommand() *cli.Command {
EnvVars: []string{"RBX_API_KEY"},
Required: true,
},
&cli.StringFlag{
Name: "redis-host",
Usage: "Host of Redis cache",
EnvVars: []string{"REDIS_HOST"},
Value: "localhost:6379",
},
&cli.StringFlag{
Name: "redis-password",
Usage: "Password for Redis",
EnvVars: []string{"REDIS_PASSWORD"},
Value: "",
},
&cli.IntFlag{
Name: "redis-db",
Usage: "Redis database number",
EnvVars: []string{"REDIS_DB"},
Value: 0,
},
&cli.StringFlag{
Name: "s3-bucket",
Usage: "S3 bucket for map assets",
EnvVars: []string{"S3_BUCKET"},
Required: true,
},
},
}
}
@@ -123,12 +152,37 @@ func serve(ctx *cli.Context) error {
_, err = js.AddStream(&nats.StreamConfig{
Name: "maptest",
Subjects: []string{"maptest.>"},
Retention: nats.InterestPolicy,
})
if err != nil {
log.WithError(err).Fatal("failed to add stream")
}
// Initialize Redis client
redisClient := redis.NewClient(&redis.Options{
Addr: ctx.String("redis-host"),
Password: ctx.String("redis-password"),
DB: ctx.Int("redis-db"),
})
// Test Redis connection
if err := redisClient.Ping(ctx.Context).Err(); err != nil {
log.WithError(err).Warn("failed to connect to Redis - thumbnails will not be cached")
}
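Warning and continuing on a failed Ping implies Redis is used as a best-effort cache-aside layer for thumbnails: try the cache, fall back to the origin, populate on the way out. A sketch of that shape with an in-memory map standing in for Redis (the key scheme and fetch function are assumptions, not the service's actual code):

```go
package main

import "fmt"

// thumbnailCache sketches cache-aside lookup: check the cache first,
// fall back to the origin fetch on a miss, then populate the cache.
type thumbnailCache struct {
	store map[string]string           // stand-in for Redis
	fetch func(assetID int64) string  // stand-in for the Roblox thumbnail API
}

func (c *thumbnailCache) Get(assetID int64, size string) string {
	key := fmt.Sprintf("thumb:%d:%s", assetID, size) // assumed key scheme
	if url, ok := c.store[key]; ok {
		return url // cache hit
	}
	url := c.fetch(assetID) // cache miss: go to origin
	c.store[key] = url      // best-effort populate; a real client would also set a TTL
	return url
}

func main() {
	calls := 0
	c := &thumbnailCache{
		store: map[string]string{},
		fetch: func(id int64) string {
			calls++
			return fmt.Sprintf("https://example.invalid/%d.png", id)
		},
	}
	c.Get(42, "150x150")
	c.Get(42, "150x150")
	fmt.Println(calls) // origin fetched only once
}
```

Keeping the cache optional (warn, don't fatal) means a Redis outage degrades to more origin traffic rather than taking thumbnails down entirely.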
// Initialize Roblox client
robloxClient := &roblox.Client{
HttpClient: http.DefaultClient,
ApiKey: ctx.String("rbx-api-key"),
}
// Initialize S3 client
awsCfg, err := awsconfig.LoadDefaultConfig(ctx.Context)
if err != nil {
log.WithError(err).Fatal("failed to load AWS config")
}
s3Client := s3.NewFromConfig(awsCfg)
// connect to main game database
conn, err := grpc.Dial(ctx.String("data-rpc-host"), grpc.WithTransportCredentials(insecure.NewCredentials()))
if err != nil {
@@ -139,13 +193,17 @@ func serve(ctx *cli.Context) error {
js,
maps.NewMapsServiceClient(conn),
users.NewUsersServiceClient(conn),
robloxClient,
redisClient,
s3Client,
ctx.String("s3-bucket"),
)
svc_external := web_api.NewService(
&svc_inner,
roblox.Client{
HttpClient: http.DefaultClient,
ApiKey: ctx.String("rbx-api-key"),
},
)
@@ -165,7 +223,11 @@ func serve(ctx *cli.Context) error {
grpcServer := grpc.NewServer()
maps_controller := controller.NewMapsController(&svc_inner)
mapfixes_controller := controller.NewMapfixesController(&svc_inner)
submissions_controller := controller.NewSubmissionsController(&svc_inner)
maps_extended.RegisterMapsServiceServer(grpcServer,&maps_controller)
mapfixes.RegisterMapfixesServiceServer(grpcServer,&mapfixes_controller)
submissions.RegisterSubmissionsServiceServer(grpcServer,&submissions_controller)
mapfix_controller := validator_controller.NewMapfixesController(&svc_inner)
operation_controller := validator_controller.NewOperationsController(&svc_inner)

pkg/controller/mapfixes.go (new file)

@@ -0,0 +1,149 @@
package controller
import (
"context"
"git.itzana.me/strafesnet/go-grpc/mapfixes"
"git.itzana.me/strafesnet/maps-service/pkg/datastore"
"git.itzana.me/strafesnet/maps-service/pkg/model"
"git.itzana.me/strafesnet/maps-service/pkg/service"
)
type Mapfixes struct {
*mapfixes.UnimplementedMapfixesServiceServer
inner *service.Service
}
func NewMapfixesController(
inner *service.Service,
) Mapfixes {
return Mapfixes{
inner: inner,
}
}
func (svc *Mapfixes) Get(ctx context.Context, request *mapfixes.MapfixId) (*mapfixes.MapfixResponse, error) {
item, err := svc.inner.GetMapfix(ctx, request.ID)
if err != nil {
return nil, err
}
var validated_asset_id *uint64
if item.ValidatedAssetID != 0 {
validated_asset_id = &item.ValidatedAssetID
}
var validated_asset_version *uint64
if item.ValidatedAssetVersion != 0 {
validated_asset_version = &item.ValidatedAssetVersion
}
return &mapfixes.MapfixResponse{
ID: item.ID,
DisplayName: item.DisplayName,
Creator: item.Creator,
GameID: uint32(item.GameID),
CreatedAt: item.CreatedAt.Unix(),
UpdatedAt: item.UpdatedAt.Unix(),
Submitter: uint64(item.Submitter),
AssetVersion: uint64(item.AssetVersion),
AssetID: item.AssetID,
ValidatedAssetID: validated_asset_id,
ValidatedAssetVersion: validated_asset_version,
TargetAssetID: item.TargetAssetID,
StatusID: mapfixes.MapfixStatus(item.StatusID),
}, nil
}
func (svc *Mapfixes) GetList(ctx context.Context, request *mapfixes.MapfixIdList) (*mapfixes.MapfixList, error) {
items, err := svc.inner.GetMapfixList(ctx, request.ID)
if err != nil {
return nil, err
}
resp := mapfixes.MapfixList{}
resp.Mapfixes = make([]*mapfixes.MapfixResponse, len(items))
for i, item := range items {
var validated_asset_id *uint64
if item.ValidatedAssetID != 0 {
validated_asset_id = &item.ValidatedAssetID
}
var validated_asset_version *uint64
if item.ValidatedAssetVersion != 0 {
validated_asset_version = &item.ValidatedAssetVersion
}
resp.Mapfixes[i] = &mapfixes.MapfixResponse{
ID: item.ID,
DisplayName: item.DisplayName,
Creator: item.Creator,
GameID: uint32(item.GameID),
CreatedAt: item.CreatedAt.Unix(),
UpdatedAt: item.UpdatedAt.Unix(),
Submitter: uint64(item.Submitter),
AssetVersion: uint64(item.AssetVersion),
AssetID: item.AssetID,
ValidatedAssetID: validated_asset_id,
ValidatedAssetVersion: validated_asset_version,
TargetAssetID: item.TargetAssetID,
StatusID: mapfixes.MapfixStatus(item.StatusID),
}
}
return &resp, nil
}
func (svc *Mapfixes) List(ctx context.Context, request *mapfixes.ListRequest) (*mapfixes.MapfixList, error) {
if request.Page == nil {
return nil, PageError
}
filter := service.NewMapfixFilter()
if request.Filter != nil {
if request.Filter.DisplayName != nil {
filter.SetDisplayName(*request.Filter.DisplayName)
}
if request.Filter.Creator != nil {
filter.SetCreator(*request.Filter.Creator)
}
if request.Filter.GameID != nil {
filter.SetGameID(*request.Filter.GameID)
}
if request.Filter.Submitter != nil {
filter.SetSubmitter(*request.Filter.Submitter)
}
}
items, err := svc.inner.ListMapfixes(ctx, filter, model.Page{
Number: int32(request.Page.Number),
Size: int32(request.Page.Size),
}, datastore.ListSortDateDescending)
if err != nil {
return nil, err
}
resp := mapfixes.MapfixList{}
resp.Mapfixes = make([]*mapfixes.MapfixResponse, len(items))
for i, item := range items {
var validated_asset_id *uint64
if item.ValidatedAssetID != 0 {
validated_asset_id = &item.ValidatedAssetID
}
var validated_asset_version *uint64
if item.ValidatedAssetVersion != 0 {
validated_asset_version = &item.ValidatedAssetVersion
}
resp.Mapfixes[i] = &mapfixes.MapfixResponse{
ID: item.ID,
DisplayName: item.DisplayName,
Creator: item.Creator,
GameID: uint32(item.GameID),
CreatedAt: item.CreatedAt.Unix(),
UpdatedAt: item.UpdatedAt.Unix(),
Submitter: uint64(item.Submitter),
AssetVersion: uint64(item.AssetVersion),
AssetID: item.AssetID,
ValidatedAssetID: validated_asset_id,
ValidatedAssetVersion: validated_asset_version,
TargetAssetID: item.TargetAssetID,
StatusID: mapfixes.MapfixStatus(item.StatusID),
}
}
return &resp, nil
}
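The zero-means-absent conversion for ValidatedAssetID and ValidatedAssetVersion is repeated verbatim in Get, GetList, and List above. The same logic can be factored into one generic helper (a refactoring sketch, not existing code):

```go
package main

import "fmt"

// nonZeroPtr returns a pointer to v, or nil when v is the zero value —
// the convention the controllers above use for optional proto fields.
func nonZeroPtr[T comparable](v T) *T {
	var zero T
	if v == zero {
		return nil
	}
	return &v
}

func main() {
	fmt.Println(nonZeroPtr(uint64(0)) == nil) // zero maps to nil (field absent)
	fmt.Println(*nonZeroPtr(uint64(7)))       // non-zero maps to a pointer to the value
}
```

Each per-field `var … *uint64; if item.X != 0 { … }` block then collapses to `ValidatedAssetID: nonZeroPtr(item.ValidatedAssetID)`, removing the triplication.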


@@ -195,3 +195,13 @@ func (svc *Maps) IncrementLoadCount(ctx context.Context, request *maps_extended.
}
return &maps_extended.NullResponse{}, nil
}
func (svc *Maps) GetSnfmDownloadUrl(ctx context.Context, request *maps_extended.MapId) (*maps_extended.SnfmDownloadUrl, error) {
url, err := svc.inner.GetSnfmDownloadUrl(ctx, request.ID)
if err != nil {
return nil, err
}
return &maps_extended.SnfmDownloadUrl{
Url: url,
}, nil
}


@@ -0,0 +1,161 @@
package controller
import (
"context"
"git.itzana.me/strafesnet/go-grpc/submissions"
"git.itzana.me/strafesnet/maps-service/pkg/datastore"
"git.itzana.me/strafesnet/maps-service/pkg/model"
"git.itzana.me/strafesnet/maps-service/pkg/service"
)
type Submissions struct {
*submissions.UnimplementedSubmissionsServiceServer
inner *service.Service
}
func NewSubmissionsController(
inner *service.Service,
) Submissions {
return Submissions{
inner: inner,
}
}
func (svc *Submissions) Get(ctx context.Context, request *submissions.SubmissionId) (*submissions.SubmissionResponse, error) {
item, err := svc.inner.GetSubmission(ctx, request.ID)
if err != nil {
return nil, err
}
var validated_asset_id *uint64
if item.ValidatedAssetID != 0 {
validated_asset_id = &item.ValidatedAssetID
}
var validated_asset_version *uint64
if item.ValidatedAssetVersion != 0 {
validated_asset_version = &item.ValidatedAssetVersion
}
var uploaded_asset_id *uint64
if item.UploadedAssetID != 0 {
uploaded_asset_id = &item.UploadedAssetID
}
return &submissions.SubmissionResponse{
ID: item.ID,
DisplayName: item.DisplayName,
Creator: item.Creator,
GameID: uint32(item.GameID),
CreatedAt: item.CreatedAt.Unix(),
UpdatedAt: item.UpdatedAt.Unix(),
Submitter: uint64(item.Submitter),
AssetVersion: uint64(item.AssetVersion),
AssetID: item.AssetID,
ValidatedAssetID: validated_asset_id,
ValidatedAssetVersion: validated_asset_version,
UploadedAssetID: uploaded_asset_id,
StatusID: submissions.SubmissionStatus(item.StatusID),
}, nil
}
func (svc *Submissions) GetList(ctx context.Context, request *submissions.SubmissionIdList) (*submissions.SubmissionList, error) {
items, err := svc.inner.GetSubmissionList(ctx, request.ID)
if err != nil {
return nil, err
}
resp := submissions.SubmissionList{}
resp.Submissions = make([]*submissions.SubmissionResponse, len(items))
for i, item := range items {
var validated_asset_id *uint64
if item.ValidatedAssetID != 0 {
validated_asset_id = &item.ValidatedAssetID
}
var validated_asset_version *uint64
if item.ValidatedAssetVersion != 0 {
validated_asset_version = &item.ValidatedAssetVersion
}
var uploaded_asset_id *uint64
if item.UploadedAssetID != 0 {
uploaded_asset_id = &item.UploadedAssetID
}
resp.Submissions[i] = &submissions.SubmissionResponse{
ID: item.ID,
DisplayName: item.DisplayName,
Creator: item.Creator,
GameID: uint32(item.GameID),
CreatedAt: item.CreatedAt.Unix(),
UpdatedAt: item.UpdatedAt.Unix(),
Submitter: uint64(item.Submitter),
AssetVersion: uint64(item.AssetVersion),
AssetID: item.AssetID,
ValidatedAssetID: validated_asset_id,
ValidatedAssetVersion: validated_asset_version,
UploadedAssetID: uploaded_asset_id,
StatusID: submissions.SubmissionStatus(item.StatusID),
}
}
return &resp, nil
}
func (svc *Submissions) List(ctx context.Context, request *submissions.ListRequest) (*submissions.SubmissionList, error) {
if request.Page == nil {
return nil, PageError
}
filter := service.NewSubmissionFilter()
if request.Filter != nil {
if request.Filter.DisplayName != nil {
filter.SetDisplayName(*request.Filter.DisplayName)
}
if request.Filter.Creator != nil {
filter.SetCreator(*request.Filter.Creator)
}
if request.Filter.GameID != nil {
filter.SetGameID(*request.Filter.GameID)
}
if request.Filter.Submitter != nil {
filter.SetSubmitter(*request.Filter.Submitter)
}
}
items, err := svc.inner.ListSubmissions(ctx, filter, model.Page{
Number: int32(request.Page.Number),
Size: int32(request.Page.Size),
}, datastore.ListSortDateDescending)
if err != nil {
return nil, err
}
resp := submissions.SubmissionList{}
resp.Submissions = make([]*submissions.SubmissionResponse, len(items))
for i, item := range items {
var validated_asset_id *uint64
if item.ValidatedAssetID != 0 {
validated_asset_id = &item.ValidatedAssetID
}
var validated_asset_version *uint64
if item.ValidatedAssetVersion != 0 {
validated_asset_version = &item.ValidatedAssetVersion
}
var uploaded_asset_id *uint64
if item.UploadedAssetID != 0 {
uploaded_asset_id = &item.UploadedAssetID
}
resp.Submissions[i] = &submissions.SubmissionResponse{
ID: item.ID,
DisplayName: item.DisplayName,
Creator: item.Creator,
GameID: uint32(item.GameID),
CreatedAt: item.CreatedAt.Unix(),
UpdatedAt: item.UpdatedAt.Unix(),
Submitter: uint64(item.Submitter),
AssetVersion: uint64(item.AssetVersion),
AssetID: item.AssetID,
ValidatedAssetID: validated_asset_id,
ValidatedAssetVersion: validated_asset_version,
UploadedAssetID: uploaded_asset_id,
StatusID: submissions.SubmissionStatus(item.StatusID),
}
}
return &resp, nil
}

View File

@@ -47,6 +47,7 @@ type Maps interface {
Create(ctx context.Context, smap model.Map) (model.Map, error)
Update(ctx context.Context, id int64, values OptionalMap) error
Delete(ctx context.Context, id int64) error
GetAll(ctx context.Context) ([]model.Map, error)
List(ctx context.Context, filters OptionalMap, page model.Page) ([]model.Map, error)
IncrementLoadCount(ctx context.Context, id int64) error
}

View File

@@ -23,6 +23,14 @@ func (q OptionalMap) AddNotNil(column string, value interface{}) OptionalMap {
return q
}
func (q OptionalMap) Pop(column string) (interface{}, bool) {
value, ok := q.filter[column]
if ok {
delete(q.filter, column)
}
return value, ok
}
func (q OptionalMap) Map() map[string]interface{} {
return q.filter
}
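Pop deletes the entry it returns; because OptionalMap wraps a map (a reference type), the deletion is visible to the caller even though the receiver is a value. A minimal standalone sketch of the same pattern (the type is re-declared here for illustration):

```go
package main

import "fmt"

// optionalMap mirrors the datastore.OptionalMap shape: a value receiver
// still mutates the shared underlying map.
type optionalMap struct {
	filter map[string]interface{}
}

// Pop removes and returns the entry for column, reporting whether it existed.
func (q optionalMap) Pop(column string) (interface{}, bool) {
	value, ok := q.filter[column]
	if ok {
		delete(q.filter, column)
	}
	return value, ok
}

func main() {
	q := optionalMap{filter: map[string]interface{}{"display_name": "bhop", "game_id": 1}}
	v, ok := q.Pop("display_name")
	fmt.Println(v, ok)         // bhop true
	fmt.Println(len(q.filter)) // 1 -- entry removed despite the value receiver
}
```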

View File

@@ -74,9 +74,21 @@ func (env *Maps) Delete(ctx context.Context, id int64) error {
return nil
}
func (env *Maps) GetAll(ctx context.Context) ([]model.Map, error) {
var maps []model.Map
if err := env.db.Find(&maps).Error; err != nil {
return nil, err
}
return maps, nil
}
func (env *Maps) List(ctx context.Context, filters datastore.OptionalMap, page model.Page) ([]model.Map, error) {
var events []model.Map
-if err := env.db.Where(filters.Map()).Offset(int((page.Number - 1) * page.Size)).Limit(int(page.Size)).Find(&events).Error; err != nil {
+tx := env.db.Model(&model.Map{})
+if displayName, ok := filters.Pop("display_name"); ok {
+tx = tx.Where("display_name ILIKE ?", "%"+displayName.(string)+"%")
+}
+if err := tx.Where(filters.Map()).Offset(int((page.Number - 1) * page.Size)).Limit(int(page.Size)).Find(&events).Error; err != nil {
return nil, err
}
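The new List body pops display_name out of the filter map so it can be matched case-insensitively with ILIKE, while the remaining filters stay as equality conditions; the offset arithmetic treats page numbers as 1-indexed. A small standalone sketch of both pieces (helper names are illustrative):

```go
package main

import "fmt"

// offset mirrors the pagination arithmetic in Maps.List:
// page numbers are 1-indexed, so page 1 starts at row 0.
func offset(pageNumber, pageSize int32) int {
	return int((pageNumber - 1) * pageSize)
}

// likePattern builds the ILIKE argument used for display_name filtering.
func likePattern(needle string) string {
	return "%" + needle + "%"
}

func main() {
	fmt.Println(offset(3, 20))       // 40
	fmt.Println(likePattern("bhop")) // %bhop%
}
```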

View File

@@ -91,3 +91,7 @@ type ReleaseMapfixRequest struct {
ModelVersion uint64
TargetAssetID uint64
}
type SeedCombobulatorRequest struct {
AssetID uint64
}

View File

@@ -17,7 +17,7 @@ type ScriptPolicy struct {
// Hash of the source code that leads to this policy.
// If this is a replacement mapping, the original source may not be pointed to by any policy.
// The original source should still exist in the scripts table, which can be located by the same hash.
-FromScriptHash int64 // postgres does not support unsigned integers, so we have to pretend
+FromScriptHash int64 `gorm:"uniqueIndex"` // postgres does not support unsigned integers, so we have to pretend
// The ID of the replacement source (ScriptPolicyReplace)
// or verbatim source (ScriptPolicyAllowed)
// or 0 (other)

View File

@@ -14,20 +14,19 @@ var (
)
// StrafesNET group roles
-type GroupRole int32
+type GroupRole int64
var (
// has ScriptWrite
-RoleQuat GroupRole = 255
-RoleItzaname GroupRole = 254
-RoleStagingDeveloper GroupRole = 240
+RoleAdmin GroupRole = 3022907289363790132
+RoleStagingDeveloper GroupRole = 1755903544954609860
RolesAll Roles = ^RolesEmpty
// has SubmissionUpload
-RoleMapAdmin GroupRole = 128
+RoleMapAdmin GroupRole = 3407269374075634150
RolesMapAdmin Roles = RolesSubmissionRelease|RolesSubmissionUpload|RolesSubmissionReview|RolesMapCouncil
// has MapfixReview
-RoleMapCouncil GroupRole = 64
+RoleMapCouncil GroupRole = 607067793768903966
RolesMapCouncil Roles = RolesMapfixReview|RolesMapfixUpload|RolesMapAccess
// access to downloading maps
-RoleMapAccess GroupRole = 32
+RoleMapAccess GroupRole = 4002786327079647416
RolesMapAccess Roles = RolesMapDownload
)
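The role constants above pair each GroupRole with a Roles bit set composed by OR-ing permission flags (e.g. RolesMapAdmin = RolesSubmissionRelease|RolesSubmissionUpload|…). A standalone sketch of that flag pattern, using illustrative bit values rather than the real ones from this package:

```go
package main

import "fmt"

// Roles is a bit set; each permission occupies one bit, mirroring the
// RolesMapCouncil = RolesMapfixReview|RolesMapfixUpload|... composition
// above. The flag values here are illustrative, not the real ones.
type Roles uint64

const (
	RolesMapDownload Roles = 1 << iota
	RolesMapfixReview
	RolesSubmissionUpload
)

// Has reports whether r includes every bit in want.
func (r Roles) Has(want Roles) bool { return r&want == want }

func main() {
	mapCouncil := RolesMapfixReview | RolesMapDownload
	fmt.Println(mapCouncil.Has(RolesMapDownload))      // true
	fmt.Println(mapCouncil.Has(RolesSubmissionUpload)) // false
}
```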

View File

@@ -26,7 +26,7 @@ func HashParse(hash string) (uint64, error){
type Script struct {
ID int64 `gorm:"primaryKey"`
Name string
-Hash int64 // postgres does not support unsigned integers, so we have to pretend
+Hash int64 `gorm:"uniqueIndex"` // postgres does not support unsigned integers, so we have to pretend
Source string
ResourceType ResourceType // is this a submission or is it a mapfix
ResourceID int64 // which submission / mapfix did this script first appear in

View File

@@ -1,8 +1,9 @@
package dto
import (
-"git.itzana.me/strafesnet/go-grpc/maps_extended"
+"time"
+"git.itzana.me/strafesnet/go-grpc/maps_extended"
)
type MapFilter struct {

View File

@@ -81,6 +81,47 @@ func (h *MapHandler) Get(ctx *gin.Context) {
})
}
// @Summary Download SNFM file
// @Description Redirects to a signed download URL for a map's SNFM file
// @Tags maps
// @Security ApiKeyAuth
// @Param id path int true "Map ID"
// @Success 307 "Redirect to signed S3 URL"
// @Failure 404 {object} dto.Error "Map not found"
// @Failure default {object} dto.Error "General error response"
// @Router /map/{id}/snfm [get]
func (h *MapHandler) GetSnfmDownloadUrl(ctx *gin.Context) {
id := ctx.Param("id")
mapID, err := strconv.ParseInt(id, 10, 64)
if err != nil {
ctx.JSON(http.StatusBadRequest, dto.Error{
Error: "Invalid map ID format",
})
return
}
resp, err := maps_extended.NewMapsServiceClient(h.mapsClient).GetSnfmDownloadUrl(ctx, &maps_extended.MapId{
ID: mapID,
})
if err != nil {
statusCode := http.StatusInternalServerError
errorMessage := "Failed to get download URL"
if status.Code(err) == codes.NotFound {
statusCode = http.StatusNotFound
errorMessage = "Map not found"
}
ctx.JSON(statusCode, dto.Error{
Error: errorMessage,
})
log.WithError(err).Error("Failed to get SNFM download URL")
return
}
ctx.Redirect(http.StatusTemporaryRedirect, resp.Url)
}
// @Summary List maps
// @Description Get a list of maps
// @Tags maps

View File

@@ -93,6 +93,13 @@ func setupRoutes(cfg *RouterConfig) (*gin.Engine, error) {
v1.GET("/map/:id", mapsHandler.Get)
}
v1Download := public_api.Group("/v1")
{
v1Download.Use(middleware.ValidateRequest("Maps", "Download", cfg.devClient))
v1Download.GET("/map/:id/snfm", mapsHandler.GetSnfmDownloadUrl)
}
// Docs
public_api.GET("/docs/*any", ginSwagger.WrapHandler(swaggerfiles.Handler))
public_api.GET("/", func(ctx *gin.Context) {

pkg/roblox/thumbnails.go Normal file
View File

@@ -0,0 +1,160 @@
package roblox
import (
"bytes"
"encoding/json"
"fmt"
"io"
"net/http"
)
// ThumbnailSize represents valid Roblox thumbnail sizes
type ThumbnailSize string
const (
Size150x150 ThumbnailSize = "150x150"
Size420x420 ThumbnailSize = "420x420"
Size768x432 ThumbnailSize = "768x432"
)
// ThumbnailFormat represents the image format
type ThumbnailFormat string
const (
FormatPng ThumbnailFormat = "Png"
FormatJpeg ThumbnailFormat = "Jpeg"
)
// ThumbnailRequest represents a single thumbnail request
type ThumbnailRequest struct {
RequestID string `json:"requestId,omitempty"`
Type string `json:"type"`
TargetID uint64 `json:"targetId"`
Size string `json:"size,omitempty"`
Format string `json:"format,omitempty"`
}
// ThumbnailData represents a single thumbnail response
type ThumbnailData struct {
TargetID uint64 `json:"targetId"`
State string `json:"state"` // "Completed", "Error", "Pending"
ImageURL string `json:"imageUrl"`
}
// BatchThumbnailsResponse represents the API response
type BatchThumbnailsResponse struct {
Data []ThumbnailData `json:"data"`
}
// GetAssetThumbnails fetches thumbnails for multiple assets in a single batch request
// Roblox allows up to 100 assets per batch
func (c *Client) GetAssetThumbnails(assetIDs []uint64, size ThumbnailSize, format ThumbnailFormat) ([]ThumbnailData, error) {
if len(assetIDs) == 0 {
return []ThumbnailData{}, nil
}
if len(assetIDs) > 100 {
return nil, GetError("batch size cannot exceed 100 assets")
}
// Build request payload - the API expects an array directly, not wrapped in an object
requests := make([]ThumbnailRequest, len(assetIDs))
for i, assetID := range assetIDs {
requests[i] = ThumbnailRequest{
Type: "Asset",
TargetID: assetID,
Size: string(size),
Format: string(format),
}
}
jsonData, err := json.Marshal(requests)
if err != nil {
return nil, GetError("JSONMarshalError: " + err.Error())
}
req, err := http.NewRequest("POST", "https://thumbnails.roblox.com/v1/batch", bytes.NewBuffer(jsonData))
if err != nil {
return nil, GetError("RequestCreationError: " + err.Error())
}
req.Header.Set("Content-Type", "application/json")
resp, err := c.HttpClient.Do(req)
if err != nil {
return nil, GetError("RequestError: " + err.Error())
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(resp.Body)
return nil, GetError(fmt.Sprintf("ResponseError: status code %d, body: %s", resp.StatusCode, string(body)))
}
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, GetError("ReadBodyError: " + err.Error())
}
var response BatchThumbnailsResponse
if err := json.Unmarshal(body, &response); err != nil {
return nil, GetError("JSONUnmarshalError: " + err.Error())
}
return response.Data, nil
}
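As the comment in GetAssetThumbnails notes, the thumbnails batch endpoint takes a bare JSON array, whereas the users endpoint (below) wraps its IDs in an object. A sketch contrasting the two payload shapes (the struct is trimmed to the relevant fields):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// thumbReq mirrors the fields of ThumbnailRequest that matter here.
type thumbReq struct {
	Type     string `json:"type"`
	TargetID uint64 `json:"targetId"`
}

// batchBody encodes the bare-array shape expected by
// thumbnails.roblox.com/v1/batch.
func batchBody(reqs []thumbReq) string {
	b, _ := json.Marshal(reqs)
	return string(b)
}

// usersBody encodes the wrapped-object shape expected by
// users.roblox.com/v1/users.
func usersBody(ids []uint64) string {
	b, _ := json.Marshal(map[string][]uint64{"userIds": ids})
	return string(b)
}

func main() {
	fmt.Println(batchBody([]thumbReq{{Type: "Asset", TargetID: 42}})) // [{"type":"Asset","targetId":42}]
	fmt.Println(usersBody([]uint64{42}))                              // {"userIds":[42]}
}
```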
// GetUserAvatarThumbnails fetches avatar thumbnails for multiple users in a single batch request
func (c *Client) GetUserAvatarThumbnails(userIDs []uint64, size ThumbnailSize, format ThumbnailFormat) ([]ThumbnailData, error) {
if len(userIDs) == 0 {
return []ThumbnailData{}, nil
}
if len(userIDs) > 100 {
return nil, GetError("batch size cannot exceed 100 users")
}
// Build request payload - the API expects an array directly, not wrapped in an object
requests := make([]ThumbnailRequest, len(userIDs))
for i, userID := range userIDs {
requests[i] = ThumbnailRequest{
Type: "AvatarHeadShot",
TargetID: userID,
Size: string(size),
Format: string(format),
}
}
jsonData, err := json.Marshal(requests)
if err != nil {
return nil, GetError("JSONMarshalError: " + err.Error())
}
req, err := http.NewRequest("POST", "https://thumbnails.roblox.com/v1/batch", bytes.NewBuffer(jsonData))
if err != nil {
return nil, GetError("RequestCreationError: " + err.Error())
}
req.Header.Set("Content-Type", "application/json")
resp, err := c.HttpClient.Do(req)
if err != nil {
return nil, GetError("RequestError: " + err.Error())
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(resp.Body)
return nil, GetError(fmt.Sprintf("ResponseError: status code %d, body: %s", resp.StatusCode, string(body)))
}
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, GetError("ReadBodyError: " + err.Error())
}
var response BatchThumbnailsResponse
if err := json.Unmarshal(body, &response); err != nil {
return nil, GetError("JSONUnmarshalError: " + err.Error())
}
return response.Data, nil
}

pkg/roblox/users.go Normal file
View File

@@ -0,0 +1,72 @@
package roblox
import (
"bytes"
"encoding/json"
"fmt"
"io"
"net/http"
)
// UserData represents a single user's information
type UserData struct {
ID uint64 `json:"id"`
Name string `json:"name"`
DisplayName string `json:"displayName"`
}
// BatchUsersResponse represents the API response for batch user requests
type BatchUsersResponse struct {
Data []UserData `json:"data"`
}
// GetUsernames fetches usernames for multiple users in a single batch request
// Roblox allows up to 100 users per batch
func (c *Client) GetUsernames(userIDs []uint64) ([]UserData, error) {
if len(userIDs) == 0 {
return []UserData{}, nil
}
if len(userIDs) > 100 {
return nil, GetError("batch size cannot exceed 100 users")
}
// Build request payload
payload := map[string][]uint64{
"userIds": userIDs,
}
jsonData, err := json.Marshal(payload)
if err != nil {
return nil, GetError("JSONMarshalError: " + err.Error())
}
req, err := http.NewRequest("POST", "https://users.roblox.com/v1/users", bytes.NewBuffer(jsonData))
if err != nil {
return nil, GetError("RequestCreationError: " + err.Error())
}
req.Header.Set("Content-Type", "application/json")
resp, err := c.HttpClient.Do(req)
if err != nil {
return nil, GetError("RequestError: " + err.Error())
}
defer resp.Body.Close()
if resp.StatusCode != http.StatusOK {
body, _ := io.ReadAll(resp.Body)
return nil, GetError(fmt.Sprintf("ResponseError: status code %d, body: %s", resp.StatusCode, string(body)))
}
body, err := io.ReadAll(resp.Body)
if err != nil {
return nil, GetError("ReadBodyError: " + err.Error())
}
var response BatchUsersResponse
if err := json.Unmarshal(body, &response); err != nil {
return nil, GetError("JSONUnmarshalError: " + err.Error())
}
return response.Data, nil
}

View File

@@ -103,6 +103,10 @@ func (svc *Service) GetMapfix(ctx context.Context, id int64) (model.Mapfix, erro
return svc.db.Mapfixes().Get(ctx, id)
}
func (svc *Service) GetMapfixList(ctx context.Context, ids []int64) ([]model.Mapfix, error) {
return svc.db.Mapfixes().GetList(ctx, ids)
}
func (svc *Service) UpdateMapfix(ctx context.Context, id int64, pmap MapfixUpdate) error {
return svc.db.Mapfixes().Update(ctx, id, datastore.OptionalMap(pmap))
}

View File

@@ -99,6 +99,10 @@ func (svc *Service) CreateMap(ctx context.Context, item model.Map) (int64, error
return map_item.ID, nil
}
func (svc *Service) GetAllMaps(ctx context.Context) ([]model.Map, error) {
return svc.db.Maps().GetAll(ctx)
}
func (svc *Service) ListMaps(ctx context.Context, filter MapFilter, page model.Page) ([]model.Map, error) {
return svc.db.Maps().List(ctx, datastore.OptionalMap(filter), page)
}

pkg/service/nats_maps.go Normal file
View File

@@ -0,0 +1,21 @@
package service
import (
"encoding/json"
"git.itzana.me/strafesnet/maps-service/pkg/model"
)
func (svc *Service) NatsSeedCombobulator(assetID uint64) error {
request := model.SeedCombobulatorRequest{
AssetID: assetID,
}
j, err := json.Marshal(request)
if err != nil {
return err
}
_, err = svc.nats.Publish("maptest.combobulator.seed", j)
return err
}
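NatsSeedCombobulator publishes a JSON-encoded SeedCombobulatorRequest to the maptest.combobulator.seed subject; the consumer side is just the inverse Unmarshal. A sketch with the type re-declared locally for illustration:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// seedCombobulatorRequest mirrors model.SeedCombobulatorRequest.
type seedCombobulatorRequest struct {
	AssetID uint64
}

// decodeSeed is what a "maptest.combobulator.seed" consumer would do
// with the message body published by NatsSeedCombobulator.
func decodeSeed(body []byte) (seedCombobulatorRequest, error) {
	var req seedCombobulatorRequest
	err := json.Unmarshal(body, &req)
	return req, err
}

func main() {
	// Producer side, as in NatsSeedCombobulator.
	body, _ := json.Marshal(seedCombobulatorRequest{AssetID: 1234})
	req, _ := decodeSeed(body)
	fmt.Println(req.AssetID) // 1234
}
```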

View File

@@ -1,17 +1,27 @@
package service
import (
"context"
"fmt"
"time"
"git.itzana.me/strafesnet/go-grpc/maps"
"git.itzana.me/strafesnet/go-grpc/users"
"git.itzana.me/strafesnet/maps-service/pkg/datastore"
"git.itzana.me/strafesnet/maps-service/pkg/roblox"
"github.com/aws/aws-sdk-go-v2/service/s3"
"github.com/nats-io/nats.go"
"github.com/redis/go-redis/v9"
)
type Service struct {
-db datastore.Datastore
-nats nats.JetStreamContext
-maps maps.MapsServiceClient
-users users.UsersServiceClient
+db datastore.Datastore
+nats nats.JetStreamContext
+maps maps.MapsServiceClient
+users users.UsersServiceClient
+thumbnailService *ThumbnailService
+s3Presign *s3.PresignClient
+s3Bucket string
}
func NewService(
@@ -19,11 +29,60 @@ func NewService(
nats nats.JetStreamContext,
maps maps.MapsServiceClient,
users users.UsersServiceClient,
robloxClient *roblox.Client,
redisClient *redis.Client,
s3Client *s3.Client,
s3Bucket string,
) Service {
return Service{
-db: db,
-nats: nats,
-maps: maps,
-users: users,
+db: db,
+nats: nats,
+maps: maps,
+users: users,
+thumbnailService: NewThumbnailService(robloxClient, redisClient),
+s3Presign: s3.NewPresignClient(s3Client),
+s3Bucket: s3Bucket,
}
}
func (s *Service) GetSnfmDownloadUrl(ctx context.Context, mapID int64) (string, error) {
key := fmt.Sprintf("maps/%d.snfm", mapID)
presigned, err := s.s3Presign.PresignGetObject(ctx, &s3.GetObjectInput{
Bucket: &s.s3Bucket,
Key: &key,
}, s3.WithPresignExpires(5*time.Minute))
if err != nil {
return "", err
}
return presigned.URL, nil
}
// GetAssetThumbnails proxies to the thumbnail service
func (s *Service) GetAssetThumbnails(ctx context.Context, assetIDs []uint64, size roblox.ThumbnailSize) (map[uint64]string, error) {
return s.thumbnailService.GetAssetThumbnails(ctx, assetIDs, size)
}
// GetUserAvatarThumbnails proxies to the thumbnail service
func (s *Service) GetUserAvatarThumbnails(ctx context.Context, userIDs []uint64, size roblox.ThumbnailSize) (map[uint64]string, error) {
return s.thumbnailService.GetUserAvatarThumbnails(ctx, userIDs, size)
}
// GetSingleAssetThumbnail proxies to the thumbnail service
func (s *Service) GetSingleAssetThumbnail(ctx context.Context, assetID uint64, size roblox.ThumbnailSize) (string, error) {
return s.thumbnailService.GetSingleAssetThumbnail(ctx, assetID, size)
}
// GetSingleUserAvatarThumbnail proxies to the thumbnail service
func (s *Service) GetSingleUserAvatarThumbnail(ctx context.Context, userID uint64, size roblox.ThumbnailSize) (string, error) {
return s.thumbnailService.GetSingleUserAvatarThumbnail(ctx, userID, size)
}
// GetUsernames proxies to the thumbnail service
func (s *Service) GetUsernames(ctx context.Context, userIDs []uint64) (map[uint64]string, error) {
return s.thumbnailService.GetUsernames(ctx, userIDs)
}
// GetSingleUsername proxies to the thumbnail service
func (s *Service) GetSingleUsername(ctx context.Context, userID uint64) (string, error) {
return s.thumbnailService.GetSingleUsername(ctx, userID)
}

pkg/service/thumbnails.go Normal file
View File

@@ -0,0 +1,218 @@
package service
import (
"context"
"encoding/json"
"fmt"
"time"
"git.itzana.me/strafesnet/maps-service/pkg/roblox"
"github.com/redis/go-redis/v9"
)
type ThumbnailService struct {
robloxClient *roblox.Client
redisClient *redis.Client
cacheTTL time.Duration
}
func NewThumbnailService(robloxClient *roblox.Client, redisClient *redis.Client) *ThumbnailService {
return &ThumbnailService{
robloxClient: robloxClient,
redisClient: redisClient,
cacheTTL: 24 * time.Hour, // Cache thumbnails for 24 hours
}
}
// CachedThumbnail represents a cached thumbnail entry
type CachedThumbnail struct {
ImageURL string `json:"imageUrl"`
State string `json:"state"`
CachedAt time.Time `json:"cachedAt"`
}
// GetAssetThumbnails fetches thumbnails with Redis caching and batching
func (s *ThumbnailService) GetAssetThumbnails(ctx context.Context, assetIDs []uint64, size roblox.ThumbnailSize) (map[uint64]string, error) {
if len(assetIDs) == 0 {
return map[uint64]string{}, nil
}
result := make(map[uint64]string)
var missingIDs []uint64
// Try to get from cache first
for _, assetID := range assetIDs {
cacheKey := fmt.Sprintf("thumbnail:asset:%d:%s", assetID, size)
cached, err := s.redisClient.Get(ctx, cacheKey).Result()
if err == redis.Nil {
// Cache miss
missingIDs = append(missingIDs, assetID)
} else if err != nil {
// Redis error - treat as cache miss
missingIDs = append(missingIDs, assetID)
} else {
// Cache hit
var thumbnail CachedThumbnail
if err := json.Unmarshal([]byte(cached), &thumbnail); err == nil && thumbnail.State == "Completed" {
result[assetID] = thumbnail.ImageURL
} else {
missingIDs = append(missingIDs, assetID)
}
}
}
// If all were cached, return early
if len(missingIDs) == 0 {
return result, nil
}
// Batch fetch missing thumbnails from Roblox API
// Split into batches of 100 (Roblox API limit)
for i := 0; i < len(missingIDs); i += 100 {
end := i + 100
if end > len(missingIDs) {
end = len(missingIDs)
}
batch := missingIDs[i:end]
thumbnails, err := s.robloxClient.GetAssetThumbnails(batch, size, roblox.FormatPng)
if err != nil {
return nil, fmt.Errorf("failed to fetch thumbnails: %w", err)
}
// Process results and cache them
for _, thumb := range thumbnails {
cached := CachedThumbnail{
ImageURL: thumb.ImageURL,
State: thumb.State,
CachedAt: time.Now(),
}
if thumb.State == "Completed" && thumb.ImageURL != "" {
result[thumb.TargetID] = thumb.ImageURL
}
// Cache the result (even if incomplete, to avoid repeated API calls)
cacheKey := fmt.Sprintf("thumbnail:asset:%d:%s", thumb.TargetID, size)
cachedJSON, _ := json.Marshal(cached)
// Use shorter TTL for incomplete thumbnails
ttl := s.cacheTTL
if thumb.State != "Completed" {
ttl = 5 * time.Minute
}
s.redisClient.Set(ctx, cacheKey, cachedJSON, ttl)
}
}
return result, nil
}
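Both batch fetchers split the cache-miss list into slices of at most 100 IDs before calling the Roblox API. The same loop, extracted into a standalone helper for illustration:

```go
package main

import "fmt"

// chunk splits ids into batches of at most size elements -- the same
// loop shape used above to respect the 100-ID limit per Roblox batch call.
func chunk(ids []uint64, size int) [][]uint64 {
	var batches [][]uint64
	for i := 0; i < len(ids); i += size {
		end := i + size
		if end > len(ids) {
			end = len(ids)
		}
		batches = append(batches, ids[i:end])
	}
	return batches
}

func main() {
	ids := make([]uint64, 250)
	b := chunk(ids, 100)
	fmt.Println(len(b), len(b[0]), len(b[2])) // 3 100 50
}
```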
// GetUserAvatarThumbnails fetches user avatar thumbnails with Redis caching and batching
func (s *ThumbnailService) GetUserAvatarThumbnails(ctx context.Context, userIDs []uint64, size roblox.ThumbnailSize) (map[uint64]string, error) {
if len(userIDs) == 0 {
return map[uint64]string{}, nil
}
result := make(map[uint64]string)
var missingIDs []uint64
// Try to get from cache first
for _, userID := range userIDs {
cacheKey := fmt.Sprintf("thumbnail:user:%d:%s", userID, size)
cached, err := s.redisClient.Get(ctx, cacheKey).Result()
if err == redis.Nil {
// Cache miss
missingIDs = append(missingIDs, userID)
} else if err != nil {
// Redis error - treat as cache miss
missingIDs = append(missingIDs, userID)
} else {
// Cache hit
var thumbnail CachedThumbnail
if err := json.Unmarshal([]byte(cached), &thumbnail); err == nil && thumbnail.State == "Completed" {
result[userID] = thumbnail.ImageURL
} else {
missingIDs = append(missingIDs, userID)
}
}
}
// If all were cached, return early
if len(missingIDs) == 0 {
return result, nil
}
// Batch fetch missing thumbnails from Roblox API
// Split into batches of 100 (Roblox API limit)
for i := 0; i < len(missingIDs); i += 100 {
end := i + 100
if end > len(missingIDs) {
end = len(missingIDs)
}
batch := missingIDs[i:end]
thumbnails, err := s.robloxClient.GetUserAvatarThumbnails(batch, size, roblox.FormatPng)
if err != nil {
return nil, fmt.Errorf("failed to fetch user thumbnails: %w", err)
}
// Process results and cache them
for _, thumb := range thumbnails {
cached := CachedThumbnail{
ImageURL: thumb.ImageURL,
State: thumb.State,
CachedAt: time.Now(),
}
if thumb.State == "Completed" && thumb.ImageURL != "" {
result[thumb.TargetID] = thumb.ImageURL
}
// Cache the result
cacheKey := fmt.Sprintf("thumbnail:user:%d:%s", thumb.TargetID, size)
cachedJSON, _ := json.Marshal(cached)
// Use shorter TTL for incomplete thumbnails
ttl := s.cacheTTL
if thumb.State != "Completed" {
ttl = 5 * time.Minute
}
s.redisClient.Set(ctx, cacheKey, cachedJSON, ttl)
}
}
return result, nil
}
// GetSingleAssetThumbnail is a convenience method for fetching a single asset thumbnail
func (s *ThumbnailService) GetSingleAssetThumbnail(ctx context.Context, assetID uint64, size roblox.ThumbnailSize) (string, error) {
results, err := s.GetAssetThumbnails(ctx, []uint64{assetID}, size)
if err != nil {
return "", err
}
if url, ok := results[assetID]; ok {
return url, nil
}
return "", fmt.Errorf("thumbnail not available for asset %d", assetID)
}
// GetSingleUserAvatarThumbnail is a convenience method for fetching a single user avatar thumbnail
func (s *ThumbnailService) GetSingleUserAvatarThumbnail(ctx context.Context, userID uint64, size roblox.ThumbnailSize) (string, error) {
results, err := s.GetUserAvatarThumbnails(ctx, []uint64{userID}, size)
if err != nil {
return "", err
}
if url, ok := results[userID]; ok {
return url, nil
}
return "", fmt.Errorf("thumbnail not available for user %d", userID)
}

pkg/service/users.go Normal file
View File

@@ -0,0 +1,108 @@
package service
import (
"context"
"encoding/json"
"fmt"
"time"
"git.itzana.me/strafesnet/maps-service/pkg/roblox"
"github.com/redis/go-redis/v9"
)
// CachedUser represents a cached user entry
type CachedUser struct {
Name string `json:"name"`
DisplayName string `json:"displayName"`
CachedAt time.Time `json:"cachedAt"`
}
// GetUsernames fetches usernames with Redis caching and batching
func (s *ThumbnailService) GetUsernames(ctx context.Context, userIDs []uint64) (map[uint64]string, error) {
if len(userIDs) == 0 {
return map[uint64]string{}, nil
}
result := make(map[uint64]string)
var missingIDs []uint64
// Try to get from cache first
for _, userID := range userIDs {
cacheKey := fmt.Sprintf("user:name:%d", userID)
cached, err := s.redisClient.Get(ctx, cacheKey).Result()
if err == redis.Nil {
// Cache miss
missingIDs = append(missingIDs, userID)
} else if err != nil {
// Redis error - treat as cache miss
missingIDs = append(missingIDs, userID)
} else {
// Cache hit
var user CachedUser
if err := json.Unmarshal([]byte(cached), &user); err == nil && user.Name != "" {
result[userID] = user.Name
} else {
missingIDs = append(missingIDs, userID)
}
}
}
// If all were cached, return early
if len(missingIDs) == 0 {
return result, nil
}
// Batch fetch missing usernames from Roblox API
// Split into batches of 100 (Roblox API limit)
for i := 0; i < len(missingIDs); i += 100 {
end := i + 100
if end > len(missingIDs) {
end = len(missingIDs)
}
batch := missingIDs[i:end]
users, err := s.robloxClient.GetUsernames(batch)
if err != nil {
return nil, fmt.Errorf("failed to fetch usernames: %w", err)
}
// Process results and cache them
for _, user := range users {
cached := CachedUser{
Name: user.Name,
DisplayName: user.DisplayName,
CachedAt: time.Now(),
}
if user.Name != "" {
result[user.ID] = user.Name
}
// Cache the result
cacheKey := fmt.Sprintf("user:name:%d", user.ID)
cachedJSON, _ := json.Marshal(cached)
// Cache usernames for a long time (7 days) since they rarely change
s.redisClient.Set(ctx, cacheKey, cachedJSON, 7*24*time.Hour)
}
}
return result, nil
}
// GetSingleUsername is a convenience method for fetching a single username
func (s *ThumbnailService) GetSingleUsername(ctx context.Context, userID uint64) (string, error) {
results, err := s.GetUsernames(ctx, []uint64{userID})
if err != nil {
return "", err
}
if name, ok := results[userID]; ok {
return name, nil
}
return "", fmt.Errorf("username not available for user %d", userID)
}

View File

@@ -2,7 +2,6 @@ package web_api
import (
"context"
-"errors"
"fmt"
"io"
"time"
@@ -35,10 +34,10 @@ var(
)
var (
-ErrCreationPhaseMapfixesLimit = errors.New("Active mapfixes limited to 20")
-ErrActiveMapfixSameTargetAssetID = errors.New("There is an active mapfix for this map already")
+ErrCreationPhaseMapfixesLimit = fmt.Errorf("%w: Active mapfixes limited to 20", ErrPermissionDenied)
+ErrActiveMapfixSameTargetAssetID = fmt.Errorf("%w: There is an active mapfix for this map already", ErrPermissionDenied)
ErrAcceptOwnMapfix = fmt.Errorf("%w: You cannot accept your own mapfix as the submitter", ErrPermissionDenied)
-ErrCreateMapfixRateLimit = errors.New("You must not create more than 5 mapfixes every 10 minutes")
+ErrCreateMapfixRateLimit = fmt.Errorf("%w: You must not create more than 5 mapfixes every 10 minutes", ErrTooManyRequests)
)
// POST /mapfixes
@@ -327,6 +326,48 @@ func (svc *Service) UpdateMapfixModel(ctx context.Context, params api.UpdateMapf
)
}
// UpdateMapfixDescription implements updateMapfixDescription operation.
//
// Update description (submitter only, status ChangesRequested or UnderConstruction).
//
// PATCH /mapfixes/{MapfixID}/description
func (svc *Service) UpdateMapfixDescription(ctx context.Context, req api.UpdateMapfixDescriptionReq, params api.UpdateMapfixDescriptionParams) error {
userInfo, ok := ctx.Value("UserInfo").(UserInfoHandle)
if !ok {
return ErrUserInfo
}
// read mapfix
mapfix, err := svc.inner.GetMapfix(ctx, params.MapfixID)
if err != nil {
return err
}
userId, err := userInfo.GetUserID()
if err != nil {
return err
}
// check if caller is the submitter
if userId != mapfix.Submitter {
return ErrPermissionDeniedNotSubmitter
}
// read the new description from request body
data, err := io.ReadAll(req)
if err != nil {
return err
}
newDescription := string(data)
// check if Status is ChangesRequested or UnderConstruction
update := service.NewMapfixUpdate()
update.SetDescription(newDescription)
allow_statuses := []model.MapfixStatus{model.MapfixStatusChangesRequested, model.MapfixStatusUnderConstruction}
return svc.inner.UpdateMapfixIfStatus(ctx, params.MapfixID, allow_statuses, update)
}
// ActionMapfixReject invokes actionMapfixReject operation.
//
// Role Reviewer changes status from Submitted -> Rejected.
@@ -406,7 +447,12 @@ func (svc *Service) ActionMapfixRequestChanges(ctx context.Context, params api.A
target_status := model.MapfixStatusChangesRequested
update := service.NewMapfixUpdate()
update.SetStatusID(target_status)
-allow_statuses := []model.MapfixStatus{model.MapfixStatusValidated, model.MapfixStatusAcceptedUnvalidated, model.MapfixStatusSubmitted}
+allow_statuses := []model.MapfixStatus{
+model.MapfixStatusUploaded,
+model.MapfixStatusValidated,
+model.MapfixStatusAcceptedUnvalidated,
+model.MapfixStatusSubmitted,
+}
err = svc.inner.UpdateMapfixIfStatus(ctx, params.MapfixID, allow_statuses, update)
if err != nil {
return err
@@ -512,7 +558,11 @@ func (svc *Service) ActionMapfixTriggerSubmit(ctx context.Context, params api.Ac
target_status := model.MapfixStatusSubmitting
update := service.NewMapfixUpdate()
update.SetStatusID(target_status)
-allow_statuses := []model.MapfixStatus{model.MapfixStatusUnderConstruction, model.MapfixStatusChangesRequested}
+allow_statuses := []model.MapfixStatus{
+model.MapfixStatusUnderConstruction,
+model.MapfixStatusChangesRequested,
+model.MapfixStatusSubmitted,
+}
err = svc.inner.UpdateMapfixIfStatus(ctx, params.MapfixID, allow_statuses, update)
if err != nil {
return err


@@ -86,6 +86,61 @@ func (svc *Service) GetMap(ctx context.Context, params api.GetMapParams) (*api.M
}, nil
}
// SeedCombobulator implements seedCombobulator operation.
//
// Queue all maps for combobulator processing.
//
// POST /maps-admin/seed-combobulator
func (svc *Service) SeedCombobulator(ctx context.Context) error {
userInfo, ok := ctx.Value("UserInfo").(UserInfoHandle)
if !ok {
return ErrUserInfo
}
has_role, err := userInfo.HasRoleSubmissionRelease()
if err != nil {
return err
}
if !has_role {
return ErrPermissionDeniedNeedRoleSubmissionRelease
}
maps, err := svc.inner.GetAllMaps(ctx)
if err != nil {
return err
}
for _, m := range maps {
if err := svc.inner.NatsSeedCombobulator(uint64(m.ID)); err != nil {
return err
}
}
return nil
}
// CombobulateMap implements combobulateMap operation.
//
// Queue a map for combobulator processing.
//
// POST /maps/{MapID}/combobulate
func (svc *Service) CombobulateMap(ctx context.Context, params api.CombobulateMapParams) error {
userInfo, ok := ctx.Value("UserInfo").(UserInfoHandle)
if !ok {
return ErrUserInfo
}
has_role, err := userInfo.HasRoleSubmissionRelease()
if err != nil {
return err
}
if !has_role {
return ErrPermissionDeniedNeedRoleSubmissionRelease
}
return svc.inner.NatsSeedCombobulator(uint64(params.MapID))
}
// DownloadMapAsset invokes downloadMapAsset operation.
//
// Download the map asset.


@@ -36,10 +36,28 @@ func (svc *Service) CreateScript(ctx context.Context, req *api.ScriptCreate) (*a
return nil, err
}
hash := int64(model.HashSource(req.Source))
// Check if a script with this hash already exists
filter := service.NewScriptFilter()
filter.SetHash(hash)
existingScripts, err := svc.inner.ListScripts(ctx, filter, model.Page{Number: 1, Size: 1})
if err != nil {
return nil, err
}
// If script with this hash exists, return existing script ID
if len(existingScripts) > 0 {
return &api.ScriptID{
ScriptID: existingScripts[0].ID,
}, nil
}
// Create new script
script, err := svc.inner.CreateScript(ctx, model.Script{
ID: 0,
Name: req.Name,
Hash: int64(model.HashSource(req.Source)),
Hash: hash,
Source: req.Source,
ResourceType: model.ResourceType(req.ResourceType),
ResourceID: req.ResourceID.Or(0),
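The hunk above deduplicates scripts by source hash: look up an existing script with the same hash and return its ID instead of inserting a duplicate. A toy in-memory sketch of the same idea, where `hashSource` is a stand-in for `model.HashSource` (assumption: any stable hash of the source works for illustration):

```go
package main

import (
	"fmt"
	"hash/fnv"
)

// hashSource stands in for model.HashSource.
func hashSource(source string) int64 {
	h := fnv.New64a()
	h.Write([]byte(source))
	return int64(h.Sum64())
}

// store is a toy in-memory script store keyed by source hash.
type store struct {
	byHash map[int64]int64 // hash -> script ID
	nextID int64
}

// createScript returns the existing ID when a script with the same
// source hash is already present, otherwise inserts a new one.
func (s *store) createScript(source string) int64 {
	h := hashSource(source)
	if id, ok := s.byHash[h]; ok {
		return id
	}
	s.nextID++
	s.byHash[h] = s.nextID
	return s.nextID
}

func main() {
	s := &store{byHash: map[int64]int64{}}
	a := s.createScript("print('hi')")
	b := s.createScript("print('hi')") // duplicate source, same ID
	fmt.Println(a == b)
}
```

Note the real handler does the lookup with a filtered `ListScripts` page of size 1, so the hash comparison happens in the datastore rather than in memory.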


@@ -2,7 +2,7 @@ package web_api
import (
"context"
"errors"
"fmt"
"git.itzana.me/strafesnet/go-grpc/auth"
"git.itzana.me/strafesnet/maps-service/pkg/api"
@@ -11,9 +11,9 @@ import (
var (
// ErrMissingSessionID there is no session id
ErrMissingSessionID = errors.New("SessionID missing")
ErrMissingSessionID = fmt.Errorf("%w: SessionID missing", ErrUserInfo)
// ErrInvalidSession caller does not have a valid session
ErrInvalidSession = errors.New("Session invalid")
ErrInvalidSession = fmt.Errorf("%w: Session invalid", ErrUserInfo)
)
type UserInfoHandle struct {
@@ -79,7 +79,7 @@ func (usr UserInfoHandle) GetRoles() (model.Roles, error) {
rolesBitflag := model.RolesEmpty
for _, r := range roles.Roles {
switch model.GroupRole(r.Rank){
case model.RoleQuat, model.RoleItzaname, model.RoleStagingDeveloper:
case model.RoleAdmin, model.RoleStagingDeveloper:
rolesBitflag|=model.RolesAll
case model.RoleMapAdmin:
rolesBitflag|=model.RolesMapAdmin


@@ -12,6 +12,8 @@ import (
)
var (
ErrBadRequest = errors.New("Bad request")
ErrTooManyRequests = errors.New("Too many requests")
// ErrPermissionDenied caller does not have the required role
ErrPermissionDenied = errors.New("Permission denied")
// ErrUserInfo user info is missing for some reason
@@ -26,7 +28,7 @@ var (
ErrPermissionDeniedNeedRoleMapDownload = fmt.Errorf("%w: Need Role MapDownload", ErrPermissionDenied)
ErrPermissionDeniedNeedRoleScriptWrite = fmt.Errorf("%w: Need Role ScriptWrite", ErrPermissionDenied)
ErrPermissionDeniedNeedRoleMaptest = fmt.Errorf("%w: Need Role Maptest", ErrPermissionDenied)
ErrNegativeID = errors.New("A negative ID was provided")
ErrNegativeID = fmt.Errorf("%w: A negative ID was provided", ErrBadRequest)
)
type Service struct {
@@ -49,14 +51,20 @@ func NewService(
// Used for common default response.
func (svc *Service) NewError(ctx context.Context, err error) *api.ErrorStatusCode {
status := 500
if errors.Is(err, datastore.ErrNotExist) {
status = 404
if errors.Is(err, ErrBadRequest) {
status = 400
}
if errors.Is(err, ErrUserInfo) {
status = 401
}
if errors.Is(err, ErrPermissionDenied) {
status = 403
}
if errors.Is(err, ErrUserInfo) {
status = 401
if errors.Is(err, datastore.ErrNotExist) {
status = 404
}
if errors.Is(err, ErrTooManyRequests) {
status = 429
}
return &api.ErrorStatusCode{
StatusCode: status,

pkg/web_api/stats.go (new file)

@@ -0,0 +1,105 @@
package web_api
import (
"context"
"git.itzana.me/strafesnet/maps-service/pkg/api"
"git.itzana.me/strafesnet/maps-service/pkg/datastore"
"git.itzana.me/strafesnet/maps-service/pkg/model"
"git.itzana.me/strafesnet/maps-service/pkg/service"
)
// GetStats implements getStats operation.
//
// GET /stats
func (svc *Service) GetStats(ctx context.Context) (*api.Stats, error) {
// Get total submissions count
totalSubmissions, _, err := svc.inner.ListSubmissionsWithTotal(ctx, service.NewSubmissionFilter(), model.Page{
Number: 1,
Size: 0, // We only want the count, not the items
}, datastore.ListSortDisabled)
if err != nil {
return nil, err
}
// Get total mapfixes count
totalMapfixes, _, err := svc.inner.ListMapfixesWithTotal(ctx, service.NewMapfixFilter(), model.Page{
Number: 1,
Size: 0, // We only want the count, not the items
}, datastore.ListSortDisabled)
if err != nil {
return nil, err
}
// Get released submissions count
releasedSubmissionsFilter := service.NewSubmissionFilter()
releasedSubmissionsFilter.SetStatuses([]model.SubmissionStatus{model.SubmissionStatusReleased})
releasedSubmissions, _, err := svc.inner.ListSubmissionsWithTotal(ctx, releasedSubmissionsFilter, model.Page{
Number: 1,
Size: 0,
}, datastore.ListSortDisabled)
if err != nil {
return nil, err
}
// Get released mapfixes count
releasedMapfixesFilter := service.NewMapfixFilter()
releasedMapfixesFilter.SetStatuses([]model.MapfixStatus{model.MapfixStatusReleased})
releasedMapfixes, _, err := svc.inner.ListMapfixesWithTotal(ctx, releasedMapfixesFilter, model.Page{
Number: 1,
Size: 0,
}, datastore.ListSortDisabled)
if err != nil {
return nil, err
}
// Get submitted submissions count (under review)
submittedSubmissionsFilter := service.NewSubmissionFilter()
submittedSubmissionsFilter.SetStatuses([]model.SubmissionStatus{
model.SubmissionStatusUnderConstruction,
model.SubmissionStatusChangesRequested,
model.SubmissionStatusSubmitting,
model.SubmissionStatusSubmitted,
model.SubmissionStatusAcceptedUnvalidated,
model.SubmissionStatusValidating,
model.SubmissionStatusValidated,
model.SubmissionStatusUploading,
model.SubmissionStatusUploaded,
})
submittedSubmissions, _, err := svc.inner.ListSubmissionsWithTotal(ctx, submittedSubmissionsFilter, model.Page{
Number: 1,
Size: 0,
}, datastore.ListSortDisabled)
if err != nil {
return nil, err
}
// Get submitted mapfixes count (under review)
submittedMapfixesFilter := service.NewMapfixFilter()
submittedMapfixesFilter.SetStatuses([]model.MapfixStatus{
model.MapfixStatusUnderConstruction,
model.MapfixStatusChangesRequested,
model.MapfixStatusSubmitting,
model.MapfixStatusSubmitted,
model.MapfixStatusAcceptedUnvalidated,
model.MapfixStatusValidating,
model.MapfixStatusValidated,
model.MapfixStatusUploading,
model.MapfixStatusUploaded,
model.MapfixStatusReleasing,
})
submittedMapfixes, _, err := svc.inner.ListMapfixesWithTotal(ctx, submittedMapfixesFilter, model.Page{
Number: 1,
Size: 0,
}, datastore.ListSortDisabled)
if err != nil {
return nil, err
}
return &api.Stats{
TotalSubmissions: totalSubmissions,
TotalMapfixes: totalMapfixes,
ReleasedSubmissions: releasedSubmissions,
ReleasedMapfixes: releasedMapfixes,
SubmittedSubmissions: submittedSubmissions,
SubmittedMapfixes: submittedMapfixes,
}, nil
}


@@ -2,7 +2,6 @@ package web_api
import (
"context"
"errors"
"fmt"
"io"
"time"
@@ -26,12 +25,13 @@ var(
)
var (
ErrCreationPhaseSubmissionsLimit = errors.New("Active submissions limited to 20")
ErrUploadedAssetIDAlreadyExists = errors.New("The submission UploadedAssetID is already set")
ErrReleaseInvalidStatus = errors.New("Only submissions with Uploaded status can be released")
ErrReleaseNoUploadedAssetID = errors.New("Only submissions with a UploadedAssetID can be released")
ErrCreationPhaseSubmissionsLimit = fmt.Errorf("%w: Active submissions limited to 20", ErrPermissionDenied)
ErrUploadedAssetIDAlreadyExists = fmt.Errorf("%w: The submission UploadedAssetID is already set", ErrPermissionDenied)
ErrReleaseInvalidStatus = fmt.Errorf("%w: Only submissions with Uploaded status can be released", ErrPermissionDenied)
ErrReleaseNoUploadedAssetID = fmt.Errorf("%w: Only submissions with a UploadedAssetID can be released", ErrPermissionDenied)
ErrAcceptOwnSubmission = fmt.Errorf("%w: You cannot accept your own submission as the submitter", ErrPermissionDenied)
ErrCreateSubmissionRateLimit = errors.New("You must not create more than 5 submissions every 10 minutes")
ErrCreateSubmissionRateLimit = fmt.Errorf("%w: You must not create more than 5 submissions every 10 minutes", ErrTooManyRequests)
ErrDisplayNameNotUnique = fmt.Errorf("%w: Cannot submit: A map exists with the same DisplayName", ErrPermissionDenied)
)
// POST /submissions
@@ -437,7 +437,12 @@ func (svc *Service) ActionSubmissionRequestChanges(ctx context.Context, params a
target_status := model.SubmissionStatusChangesRequested
update := service.NewSubmissionUpdate()
update.SetStatusID(target_status)
allowed_statuses := []model.SubmissionStatus{model.SubmissionStatusValidated, model.SubmissionStatusAcceptedUnvalidated, model.SubmissionStatusSubmitted}
allowed_statuses := []model.SubmissionStatus{
model.SubmissionStatusUploaded,
model.SubmissionStatusValidated,
model.SubmissionStatusAcceptedUnvalidated,
model.SubmissionStatusSubmitted,
}
err = svc.inner.UpdateSubmissionIfStatus(ctx, params.SubmissionID, allowed_statuses, update)
if err != nil {
return err
@@ -547,11 +552,33 @@ func (svc *Service) ActionSubmissionTriggerSubmit(ctx context.Context, params ap
}
}
// check for maps with the exact same name
filter := service.NewMapFilter()
filter.SetDisplayName(submission.DisplayName)
maps_list, err := svc.inner.ListMaps(
ctx,
filter,
model.Page{
Number: 1,
Size: 1,
},
)
if err != nil {
return err
}
if len(maps_list) != 0 {
return ErrDisplayNameNotUnique
}
// transaction
target_status := model.SubmissionStatusSubmitting
update := service.NewSubmissionUpdate()
update.SetStatusID(target_status)
allowed_statuses := []model.SubmissionStatus{model.SubmissionStatusUnderConstruction, model.SubmissionStatusChangesRequested}
allowed_statuses := []model.SubmissionStatus{
model.SubmissionStatusUnderConstruction,
model.SubmissionStatusChangesRequested,
model.SubmissionStatusSubmitted,
}
err = svc.inner.UpdateSubmissionIfStatus(ctx, params.SubmissionID, allowed_statuses, update)
if err != nil {
return err
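The uniqueness guard above lists at most one map with the exact `DisplayName` and refuses to submit if anything comes back. A toy sketch of the same check, where the `existing` slice stands in for the real `ListMaps` call against the datastore:

```go
package main

import (
	"errors"
	"fmt"
)

var ErrDisplayNameNotUnique = errors.New("a map exists with the same DisplayName")

// checkUnique mirrors the pre-submit guard: fail when a map with the
// exact DisplayName already exists (existing stands in for ListMaps).
func checkUnique(existing []string, displayName string) error {
	for _, name := range existing {
		if name == displayName {
			return ErrDisplayNameNotUnique
		}
	}
	return nil
}

func main() {
	maps := []string{"Bhop Aurora"}
	fmt.Println(checkUnique(maps, "Bhop Aurora") != nil) // duplicate is rejected
	fmt.Println(checkUnique(maps, "Surf Mesa") == nil)   // unique name passes
}
```

As with the status guard, note the real check races against concurrent submissions unless the datastore enforces it; the `Page{Number: 1, Size: 1}` filter just makes the existence query cheap.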

pkg/web_api/thumbnails.go (new file)

@@ -0,0 +1,135 @@
package web_api
import (
"context"
"strconv"
"git.itzana.me/strafesnet/maps-service/pkg/api"
"git.itzana.me/strafesnet/maps-service/pkg/roblox"
)
// BatchAssetThumbnails handles batch fetching of asset thumbnails
func (svc *Service) BatchAssetThumbnails(ctx context.Context, req *api.BatchAssetThumbnailsReq) (*api.BatchAssetThumbnailsOK, error) {
if len(req.AssetIds) == 0 {
return &api.BatchAssetThumbnailsOK{
Thumbnails: api.NewOptBatchAssetThumbnailsOKThumbnails(map[string]string{}),
}, nil
}
// Convert size string to enum
size := roblox.Size420x420
if req.Size.IsSet() {
sizeStr := req.Size.Value
switch api.BatchAssetThumbnailsReqSize(sizeStr) {
case api.BatchAssetThumbnailsReqSize150x150:
size = roblox.Size150x150
case api.BatchAssetThumbnailsReqSize768x432:
size = roblox.Size768x432
}
}
// Fetch thumbnails from service
thumbnails, err := svc.inner.GetAssetThumbnails(ctx, req.AssetIds, size)
if err != nil {
return nil, err
}
// Convert map[uint64]string to map[string]string for JSON
result := make(map[string]string, len(thumbnails))
for assetID, url := range thumbnails {
result[strconv.FormatUint(assetID, 10)] = url
}
return &api.BatchAssetThumbnailsOK{
Thumbnails: api.NewOptBatchAssetThumbnailsOKThumbnails(result),
}, nil
}
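The `map[uint64]string` to `map[string]string` conversion above exists because JSON object keys must be strings, so numeric asset IDs are rendered in decimal with `strconv.FormatUint`. The same conversion, extracted into a small standalone helper (`stringKeys` is illustrative, not a name from the codebase):

```go
package main

import (
	"fmt"
	"strconv"
)

// stringKeys converts a map keyed by uint64 IDs into a map keyed by
// their decimal string form, as required for JSON object keys.
func stringKeys(in map[uint64]string) map[string]string {
	out := make(map[string]string, len(in))
	for id, url := range in {
		out[strconv.FormatUint(id, 10)] = url
	}
	return out
}

func main() {
	m := stringKeys(map[uint64]string{42: "https://example.invalid/a.png"})
	fmt.Println(m["42"])
}
```

The batch user-thumbnail and username handlers below repeat this exact shape, which is why the conversion loop appears several times in the diff.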
// GetAssetThumbnail handles single asset thumbnail fetch (with redirect)
func (svc *Service) GetAssetThumbnail(ctx context.Context, params api.GetAssetThumbnailParams) (*api.GetAssetThumbnailFound, error) {
// Convert size string to enum
size := roblox.Size420x420
if params.Size.IsSet() {
sizeStr := params.Size.Value
switch api.GetAssetThumbnailSize(sizeStr) {
case api.GetAssetThumbnailSize150x150:
size = roblox.Size150x150
case api.GetAssetThumbnailSize768x432:
size = roblox.Size768x432
}
}
// Fetch thumbnail
thumbnailURL, err := svc.inner.GetSingleAssetThumbnail(ctx, params.AssetID, size)
if err != nil {
return nil, err
}
// Return redirect response
return &api.GetAssetThumbnailFound{
Location: api.NewOptString(thumbnailURL),
}, nil
}
// BatchUserThumbnails handles batch fetching of user avatar thumbnails
func (svc *Service) BatchUserThumbnails(ctx context.Context, req *api.BatchUserThumbnailsReq) (*api.BatchUserThumbnailsOK, error) {
if len(req.UserIds) == 0 {
return &api.BatchUserThumbnailsOK{
Thumbnails: api.NewOptBatchUserThumbnailsOKThumbnails(map[string]string{}),
}, nil
}
// Convert size string to enum
size := roblox.Size150x150
if req.Size.IsSet() {
sizeStr := req.Size.Value
switch api.BatchUserThumbnailsReqSize(sizeStr) {
case api.BatchUserThumbnailsReqSize420x420:
size = roblox.Size420x420
case api.BatchUserThumbnailsReqSize768x432:
size = roblox.Size768x432
}
}
// Fetch thumbnails from service
thumbnails, err := svc.inner.GetUserAvatarThumbnails(ctx, req.UserIds, size)
if err != nil {
return nil, err
}
// Convert map[uint64]string to map[string]string for JSON
result := make(map[string]string, len(thumbnails))
for userID, url := range thumbnails {
result[strconv.FormatUint(userID, 10)] = url
}
return &api.BatchUserThumbnailsOK{
Thumbnails: api.NewOptBatchUserThumbnailsOKThumbnails(result),
}, nil
}
// GetUserThumbnail handles single user avatar thumbnail fetch (with redirect)
func (svc *Service) GetUserThumbnail(ctx context.Context, params api.GetUserThumbnailParams) (*api.GetUserThumbnailFound, error) {
// Convert size string to enum
size := roblox.Size150x150
if params.Size.IsSet() {
sizeStr := params.Size.Value
switch api.GetUserThumbnailSize(sizeStr) {
case api.GetUserThumbnailSize420x420:
size = roblox.Size420x420
case api.GetUserThumbnailSize768x432:
size = roblox.Size768x432
}
}
// Fetch thumbnail
thumbnailURL, err := svc.inner.GetSingleUserAvatarThumbnail(ctx, params.UserID, size)
if err != nil {
return nil, err
}
// Return redirect response
return &api.GetUserThumbnailFound{
Location: api.NewOptString(thumbnailURL),
}, nil
}

pkg/web_api/users.go (new file)

@@ -0,0 +1,33 @@
package web_api
import (
"context"
"strconv"
"git.itzana.me/strafesnet/maps-service/pkg/api"
)
// BatchUsernames handles batch fetching of usernames
func (svc *Service) BatchUsernames(ctx context.Context, req *api.BatchUsernamesReq) (*api.BatchUsernamesOK, error) {
if len(req.UserIds) == 0 {
return &api.BatchUsernamesOK{
Usernames: api.NewOptBatchUsernamesOKUsernames(map[string]string{}),
}, nil
}
// Fetch usernames from service
usernames, err := svc.inner.GetUsernames(ctx, req.UserIds)
if err != nil {
return nil, err
}
// Convert map[uint64]string to map[string]string for JSON
result := make(map[string]string, len(usernames))
for userID, username := range usernames {
result[strconv.FormatUint(userID, 10)] = username
}
return &api.BatchUsernamesOK{
Usernames: api.NewOptBatchUsernamesOKUsernames(result),
}, nil
}


@@ -17,7 +17,7 @@ reqwest = { version = "0", features = [
# default features
"charset", "http2", "system-proxy"
], default-features = false }
serde = { version = "1", features = ["derive"] }
serde_json = "1"
serde.workspace = true
serde_json.workspace = true
serde_repr = "0.1.19"
url = "2"


@@ -30,7 +30,6 @@ impl<Items> std::error::Error for SingleItemError<Items> where Items:std::fmt::D
pub type ScriptSingleItemError=SingleItemError<Vec<ScriptID>>;
pub type ScriptPolicySingleItemError=SingleItemError<Vec<ScriptPolicyID>>;
#[allow(dead_code)]
#[derive(Debug)]
pub struct UrlAndBody{
pub url:url::Url,
@@ -76,7 +75,7 @@ pub enum GameID{
FlyTrials=5,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct CreateMapfixRequest<'a>{
pub OperationID:OperationID,
@@ -89,13 +88,13 @@ pub struct CreateMapfixRequest<'a>{
pub TargetAssetID:u64,
pub Description:&'a str,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct MapfixIDResponse{
pub MapfixID:MapfixID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct CreateSubmissionRequest<'a>{
pub OperationID:OperationID,
@@ -108,7 +107,7 @@ pub struct CreateSubmissionRequest<'a>{
pub Status:u32,
pub Roles:u32,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct SubmissionIDResponse{
pub SubmissionID:SubmissionID,
@@ -127,11 +126,11 @@ pub enum ResourceType{
Submission=2,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
pub struct GetScriptRequest{
pub ScriptID:ScriptID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct GetScriptsRequest<'a>{
pub Page:u32,
@@ -151,7 +150,7 @@ pub struct GetScriptsRequest<'a>{
pub struct HashRequest<'a>{
pub hash:&'a str,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct ScriptResponse{
pub ID:ScriptID,
@@ -161,7 +160,7 @@ pub struct ScriptResponse{
pub ResourceType:ResourceType,
pub ResourceID:ResourceID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct CreateScriptRequest<'a>{
pub Name:&'a str,
@@ -170,7 +169,7 @@ pub struct CreateScriptRequest<'a>{
#[serde(skip_serializing_if="Option::is_none")]
pub ResourceID:Option<ResourceID>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct ScriptIDResponse{
pub ScriptID:ScriptID,
@@ -186,11 +185,11 @@ pub enum Policy{
Replace=4,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
pub struct GetScriptPolicyRequest{
pub ScriptPolicyID:ScriptPolicyID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct GetScriptPoliciesRequest<'a>{
pub Page:u32,
@@ -202,7 +201,7 @@ pub struct GetScriptPoliciesRequest<'a>{
#[serde(skip_serializing_if="Option::is_none")]
pub Policy:Option<Policy>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct ScriptPolicyResponse{
pub ID:ScriptPolicyID,
@@ -210,20 +209,20 @@ pub struct ScriptPolicyResponse{
pub ToScriptID:ScriptID,
pub Policy:Policy
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct CreateScriptPolicyRequest{
pub FromScriptID:ScriptID,
pub ToScriptID:ScriptID,
pub Policy:Policy,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct ScriptPolicyIDResponse{
pub ScriptPolicyID:ScriptPolicyID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct UpdateScriptPolicyRequest{
pub ID:ScriptPolicyID,
@@ -235,7 +234,7 @@ pub struct UpdateScriptPolicyRequest{
pub Policy:Option<Policy>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct UpdateSubmissionModelRequest{
pub SubmissionID:SubmissionID,
@@ -276,7 +275,7 @@ pub enum MapfixStatus{
Released=10,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct GetMapfixesRequest<'a>{
pub Page:u32,
@@ -292,7 +291,7 @@ pub struct GetMapfixesRequest<'a>{
pub StatusID:Option<MapfixStatus>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize,serde::Deserialize)]
pub struct MapfixResponse{
pub ID:MapfixID,
@@ -312,7 +311,7 @@ pub struct MapfixResponse{
pub Description:String,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct MapfixesResponse{
pub Total:u64,
@@ -342,7 +341,7 @@ pub enum SubmissionStatus{
Released=10,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct GetSubmissionsRequest<'a>{
pub Page:u32,
@@ -358,7 +357,7 @@ pub struct GetSubmissionsRequest<'a>{
pub StatusID:Option<SubmissionStatus>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct SubmissionResponse{
pub ID:SubmissionID,
@@ -376,14 +375,14 @@ pub struct SubmissionResponse{
pub StatusID:SubmissionStatus,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct SubmissionsResponse{
pub Total:u64,
pub Submissions:Vec<SubmissionResponse>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct GetMapsRequest<'a>{
pub Page:u32,
@@ -394,7 +393,7 @@ pub struct GetMapsRequest<'a>{
pub GameID:Option<GameID>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct MapResponse{
pub ID:i64,
@@ -404,7 +403,7 @@ pub struct MapResponse{
pub Date:i64,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct GetMapfixAuditEventsRequest{
pub Page:u32,
@@ -412,7 +411,7 @@ pub struct GetMapfixAuditEventsRequest{
pub MapfixID:i64,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct GetSubmissionAuditEventsRequest{
pub Page:u32,
@@ -420,7 +419,6 @@ pub struct GetSubmissionAuditEventsRequest{
pub SubmissionID:i64,
}
#[allow(nonstandard_style)]
#[derive(Clone,Debug,serde_repr::Deserialize_repr)]
#[repr(u32)]
pub enum AuditEventType{
@@ -475,7 +473,6 @@ pub struct AuditEventCheckList{
pub check_list:Vec<AuditEventCheck>,
}
#[allow(nonstandard_style)]
#[derive(Clone,Debug)]
pub enum AuditEventData{
Action(AuditEventAction),
@@ -491,7 +488,7 @@ pub enum AuditEventData{
#[derive(Clone,Copy,Debug,Hash,Eq,PartialEq,serde::Serialize,serde::Deserialize)]
pub struct AuditEventID(pub(crate)i64);
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct AuditEventReponse{
pub ID:AuditEventID,
@@ -518,7 +515,7 @@ impl AuditEventReponse{
}
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct Check{
pub Name:&'static str,
@@ -526,7 +523,7 @@ pub struct Check{
pub Passed:bool,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionSubmissionSubmittedRequest{
pub SubmissionID:SubmissionID,
@@ -536,33 +533,33 @@ pub struct ActionSubmissionSubmittedRequest{
pub GameID:GameID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionSubmissionRequestChangesRequest{
pub SubmissionID:SubmissionID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionSubmissionUploadedRequest{
pub SubmissionID:SubmissionID,
pub UploadedAssetID:u64,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionSubmissionAcceptedRequest{
pub SubmissionID:SubmissionID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct CreateSubmissionAuditErrorRequest{
pub SubmissionID:SubmissionID,
pub ErrorMessage:String,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct CreateSubmissionAuditCheckListRequest<'a>{
pub SubmissionID:SubmissionID,
@@ -580,7 +577,7 @@ impl SubmissionID{
}
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct UpdateMapfixModelRequest{
pub MapfixID:MapfixID,
@@ -588,7 +585,7 @@ pub struct UpdateMapfixModelRequest{
pub ModelVersion:u64,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionMapfixSubmittedRequest{
pub MapfixID:MapfixID,
@@ -598,32 +595,32 @@ pub struct ActionMapfixSubmittedRequest{
pub GameID:GameID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionMapfixRequestChangesRequest{
pub MapfixID:MapfixID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionMapfixUploadedRequest{
pub MapfixID:MapfixID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionMapfixAcceptedRequest{
pub MapfixID:MapfixID,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct CreateMapfixAuditErrorRequest{
pub MapfixID:MapfixID,
pub ErrorMessage:String,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct CreateMapfixAuditCheckListRequest<'a>{
pub MapfixID:MapfixID,
@@ -641,7 +638,7 @@ impl MapfixID{
}
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug)]
pub struct ActionOperationFailedRequest{
pub OperationID:OperationID,
@@ -668,7 +665,7 @@ impl Resource{
}
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Serialize)]
pub struct ReleaseInfo{
pub SubmissionID:SubmissionID,
@@ -678,7 +675,7 @@ pub struct ReleaseInfo{
pub struct ReleaseRequest<'a>{
pub schedule:&'a [ReleaseInfo],
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(Clone,Debug,serde::Deserialize)]
pub struct OperationIDResponse{
pub OperationID:OperationID,


@@ -4,18 +4,18 @@ version = "0.1.1"
edition = "2024"
[dependencies]
async-nats = "0.45.0"
futures = "0.3.31"
rbx_asset = { version = "0.5.0", features = ["gzip", "rustls-tls"], default-features = false, registry = "strafesnet" }
rbx_binary = "2.0.0"
rbx_dom_weak = "4.0.0"
async-nats.workspace = true
futures-util.workspace = true
rbx_asset.workspace = true
rbx_binary.workspace = true
rbx_dom_weak.workspace = true
rbx_reflection_database = "2.0.1"
rbx_xml = "2.0.0"
regex = { version = "1.11.3", default-features = false }
serde = { version = "1.0.215", features = ["derive"] }
serde_json = "1.0.133"
serde.workspace = true
serde_json.workspace = true
siphasher = "1.0.1"
tokio = { version = "1.41.1", features = ["macros", "rt-multi-thread", "signal"] }
tokio.workspace = true
heck = "0.5.0"
rust-grpc = { version = "1.6.1", registry = "strafesnet" }
tonic = "0.14.1"


@@ -1,3 +1,4 @@
FROM alpine:3.21 AS runtime
COPY /target/x86_64-unknown-linux-musl/release/maps-validation /
FROM debian:trixie-slim AS runtime
RUN apt-get update && apt-get install -y --no-install-recommends libssl3t64 ca-certificates && rm -rf /var/lib/apt/lists/*
COPY /target/release/maps-validation /
ENTRYPOINT ["/maps-validation"]


@@ -6,7 +6,7 @@ use heck::{ToSnakeCase,ToTitleCase};
use rbx_dom_weak::Instance;
use rust_grpc::validator::Check;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ModelInfoDownload(rbx_asset::cloud::GetError),
@@ -33,7 +33,7 @@ macro_rules! lazy_regex{
}};
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
pub struct CheckRequest{
ModelID:u64,
SkipChecks:bool,
@@ -79,7 +79,7 @@ struct ModeElement{
zone:Zone,
mode_id:ModeID,
}
#[allow(dead_code)]
#[expect(dead_code)]
pub enum IDParseError{
NoCaptures,
ParseInt(core::num::ParseIntError),
@@ -324,25 +324,24 @@ pub fn get_model_info<'a>(dom:&'a rbx_dom_weak::WeakDom,model_instance:&'a rbx_d
}
// check if an observed string matches an expected string
pub struct StringCheck<'a,T,Str>(Result<T,StringCheckContext<'a,Str>>);
pub struct StringCheckContext<'a,Str>{
pub struct StringEquality<'a,Str>{
observed:&'a str,
expected:Str,
}
impl<'a,Str> StringCheckContext<'a,Str>
impl<'a,Str> StringEquality<'a,Str>
where
&'a str:PartialEq<Str>,
{
/// Compute the StringCheck, passing through the provided value on success.
fn check<T>(self,value:T)->StringCheck<'a,T,Str>{
fn check<T>(self,value:T)->Result<T,Self>{
if self.observed==self.expected{
StringCheck(Ok(value))
Ok(value)
}else{
StringCheck(Err(self))
Err(self)
}
}
}
impl<Str:std::fmt::Display> std::fmt::Display for StringCheckContext<'_,Str>{
impl<Str:std::fmt::Display> std::fmt::Display for StringEquality<'_,Str>{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
write!(f,"expected: {}, observed: {}",self.expected,self.observed)
}
@@ -442,7 +441,7 @@ pub struct MapInfoOwned{
pub creator:String,
pub game_id:GameID,
}
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum IntoMapInfoOwnedError{
DisplayName(StringValueError),
@@ -464,19 +463,66 @@ impl TryFrom<MapInfo<'_>> for MapInfoOwned{
struct Exists;
struct Absent;
enum DisplayNameError<'a>{
TitleCase(StringEquality<'a,String>),
Empty(StringEmpty),
TooLong(usize),
StringValue(StringValueError),
}
fn check_display_name<'a>(display_name:Result<&'a str,StringValueError>)->Result<&'a str,DisplayNameError<'a>>{
// DisplayName StringValue can be missing or whatever
let display_name=display_name.map_err(DisplayNameError::StringValue)?;
// DisplayName cannot be ""
let display_name=check_empty(display_name).map_err(DisplayNameError::Empty)?;
// DisplayName cannot exceed 50 characters
if 50<display_name.len(){
return Err(DisplayNameError::TooLong(display_name.len()));
}
// Check title case
let display_name=StringEquality{
observed:display_name,
expected:display_name.to_title_case(),
}.check(display_name).map_err(DisplayNameError::TitleCase)?;
Ok(display_name)
}
enum CreatorError{
Empty(StringEmpty),
TooLong(usize),
StringValue(StringValueError),
}
fn check_creator<'a>(creator:Result<&'a str,StringValueError>)->Result<&'a str,CreatorError>{
// Creator StringValue can be missing or whatever
let creator=creator.map_err(CreatorError::StringValue)?;
// Creator cannot be ""
let creator=check_empty(creator).map_err(CreatorError::Empty)?;
// Creator cannot exceed 50 characters
if 50<creator.len(){
return Err(CreatorError::TooLong(creator.len()));
}
Ok(creator)
}
/// The result of every map check.
struct MapCheck<'a>{
// === METADATA CHECKS ===
// The root must be of class Model
model_class:StringCheck<'a,(),&'static str>,
model_class:Result<(),StringEquality<'a,&'static str>>,
// Model's name must be in snake case
model_name:StringCheck<'a,(),String>,
model_name:Result<(),StringEquality<'a,String>>,
// Map must have a StringValue named DisplayName.
// Value must not be empty, must be in title case.
display_name:Result<Result<StringCheck<'a,&'a str,String>,StringEmpty>,StringValueError>,
display_name:Result<&'a str,DisplayNameError<'a>>,
// Map must have a StringValue named Creator.
// Value must not be empty.
creator:Result<Result<&'a str,StringEmpty>,StringValueError>,
creator:Result<&'a str,CreatorError>,
// The prefix of the model's name must match the game it was submitted for.
// bhop_ for bhop, and surf_ for surf
game_id:Result<GameID,ParseGameIDError>,
@@ -511,27 +557,22 @@ struct MapCheck<'a>{
impl<'a> ModelInfo<'a>{
fn check(self)->MapCheck<'a>{
// Check class is exactly "Model"
let model_class=StringCheckContext{
let model_class=StringEquality{
observed:self.model_class,
expected:"Model",
}.check(());
// Check model name is snake case
let model_name=StringCheckContext{
let model_name=StringEquality{
observed:self.model_name,
expected:self.model_name.to_snake_case(),
}.check(());
// Check display name is not empty and has title case
let display_name=self.map_info.display_name.map(|display_name|{
check_empty(display_name).map(|display_name|StringCheckContext{
observed:display_name,
expected:display_name.to_title_case(),
}.check(display_name))
});
let display_name=check_display_name(self.map_info.display_name);
// Check Creator is not empty
let creator=self.map_info.creator.map(check_empty);
let creator=check_creator(self.map_info.creator);
// Check GameID (model name was prefixed with bhop_ surf_ etc)
let game_id=self.map_info.game_id;
@@ -630,10 +671,10 @@ impl MapCheck<'_>{
fn result(self)->Result<MapInfoOwned,Result<MapCheckList,serde_json::Error>>{
match self{
MapCheck{
model_class:StringCheck(Ok(())),
model_name:StringCheck(Ok(())),
display_name:Ok(Ok(StringCheck(Ok(display_name)))),
creator:Ok(Ok(creator)),
model_class:Ok(()),
model_name:Ok(()),
display_name:Ok(display_name),
creator:Ok(creator),
game_id:Ok(game_id),
mapstart:Ok(Exists),
mode_start_counts:DuplicateCheck(Ok(())),
@@ -737,27 +778,25 @@ macro_rules! summary_format{
impl MapCheck<'_>{
fn itemize(&self)->Result<MapCheckList,serde_json::Error>{
let model_class=match &self.model_class{
StringCheck(Ok(()))=>passed!("ModelClass"),
StringCheck(Err(context))=>summary_format!("ModelClass","Invalid model class: {context}"),
Ok(())=>passed!("ModelClass"),
Err(context)=>summary_format!("ModelClass","Invalid model class: {context}"),
};
let model_name=match &self.model_name{
StringCheck(Ok(()))=>passed!("ModelName"),
StringCheck(Err(context))=>summary_format!("ModelName","Model name must have snake_case: {context}"),
Ok(())=>passed!("ModelName"),
Err(context)=>summary_format!("ModelName","Model name must have snake_case: {context}"),
};
let display_name=match &self.display_name{
Ok(Ok(StringCheck(Ok(_))))=>passed!("DisplayName"),
Ok(Ok(StringCheck(Err(context))))=>summary_format!("DisplayName","DisplayName must have Title Case: {context}"),
Ok(Err(context))=>summary_format!("DisplayName","Invalid DisplayName: {context}"),
Err(StringValueError::ObjectNotFound)=>summary!("DisplayName","Missing DisplayName StringValue".to_owned()),
Err(StringValueError::ValueNotSet)=>summary!("DisplayName","DisplayName Value not set".to_owned()),
Err(StringValueError::NonStringValue)=>summary!("DisplayName","DisplayName Value is not a String".to_owned()),
Ok(_)=>passed!("DisplayName"),
Err(DisplayNameError::TitleCase(context))=>summary_format!("DisplayName","DisplayName must have Title Case: {context}"),
Err(DisplayNameError::Empty(context))=>summary_format!("DisplayName","Invalid DisplayName: {context}"),
Err(DisplayNameError::TooLong(context))=>summary_format!("DisplayName","DisplayName is too long: {context} characters (50 characters max)"),
Err(DisplayNameError::StringValue(context))=>summary_format!("DisplayName","DisplayName StringValue: {context}"),
};
let creator=match &self.creator{
Ok(Ok(_))=>passed!("Creator"),
Ok(Err(context))=>summary_format!("Creator","Invalid Creator: {context}"),
Err(StringValueError::ObjectNotFound)=>summary!("Creator","Missing Creator StringValue".to_owned()),
Err(StringValueError::ValueNotSet)=>summary!("Creator","Creator Value not set".to_owned()),
Err(StringValueError::NonStringValue)=>summary!("Creator","Creator Value is not a String".to_owned()),
Ok(_)=>passed!("Creator"),
Err(CreatorError::Empty(context))=>summary_format!("Creator","Invalid Creator: {context}"),
Err(CreatorError::TooLong(context))=>summary_format!("Creator","Creator is too long: {context} characters (50 characters max)"),
Err(CreatorError::StringValue(context))=>summary_format!("Creator","Creator StringValue: {context}"),
};
let game_id=match &self.game_id{
Ok(_)=>passed!("GameID"),


@@ -1,7 +1,7 @@
use crate::check::CheckListAndVersion;
use crate::nats_types::CheckMapfixRequest;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
Check(crate::check::Error),
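The recurring change in this diff swaps `#[allow(dead_code)]` for `#[expect(dead_code)]` (stable since Rust 1.81). `expect` suppresses the lint exactly like `allow`, but additionally fires the `unfulfilled_lint_expectations` warning if the lint ever stops triggering, so stale suppressions surface on their own. A minimal sketch:

```rust
// Like the error enums in this diff: the variant payloads are only read
// via Debug formatting, so dead_code fires and the expectation is met.
#[expect(dead_code)]
#[derive(Debug)]
enum Error {
    Check(String),
}

// If something started calling this, the compiler would warn that the
// expectation below is unfulfilled, prompting removal of the attribute.
#[expect(dead_code)]
fn unused_helper() -> u32 {
    42
}

fn status_code() -> u32 {
    200
}
```

This is why `main.rs` later in the diff simply deletes its `#[allow(dead_code)]` instead of converting it: there the lint no longer fires, so an `expect` would itself warn.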


@@ -1,7 +1,7 @@
use crate::check::CheckListAndVersion;
use crate::nats_types::CheckSubmissionRequest;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
Check(crate::check::Error),


@@ -1,7 +1,7 @@
use crate::download::download_asset_version;
use crate::rbx_util::{get_root_instance,get_mapinfo,read_dom,MapInfo,ReadDomError,GetRootInstanceError,GameID};
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
CreatorTypeMustBeUser,
@@ -17,11 +17,11 @@ impl std::fmt::Display for Error{
}
impl std::error::Error for Error{}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
pub struct CreateRequest{
pub ModelID:u64,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
pub struct CreateResult{
pub AssetOwner:u64,
pub DisplayName:Option<String>,


@@ -1,7 +1,7 @@
use crate::nats_types::CreateMapfixRequest;
use crate::create::CreateRequest;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
Create(crate::create::Error),


@@ -2,7 +2,7 @@ use crate::nats_types::CreateSubmissionRequest;
use crate::create::CreateRequest;
use crate::rbx_util::GameID;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
Create(crate::create::Error),


@@ -1,4 +1,4 @@
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ModelLocationDownload(rbx_asset::cloud::GetError),


@@ -1,4 +1,4 @@
use futures::StreamExt;
use futures_util::StreamExt;
mod download;
mod grpc;
@@ -22,7 +22,6 @@ mod validator;
mod validate_mapfix;
mod validate_submission;
#[allow(dead_code)]
#[derive(Debug)]
pub enum StartupError{
API(tonic::transport::Error),


@@ -1,4 +1,4 @@
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum HandleMessageError{
Messages(async_nats::jetstream::consumer::pull::MessagesError),


@@ -4,7 +4,7 @@
// Requests are sent from maps-service to validator
// Validation invokes the REST api to update the submissions
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct CreateSubmissionRequest{
// operation_id is passed back in the response message
@@ -18,7 +18,7 @@ pub struct CreateSubmissionRequest{
pub Roles:u32,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct CreateMapfixRequest{
pub OperationID:u32,
@@ -27,7 +27,7 @@ pub struct CreateMapfixRequest{
pub Description:String,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct CheckSubmissionRequest{
pub SubmissionID:u64,
@@ -35,7 +35,7 @@ pub struct CheckSubmissionRequest{
pub SkipChecks:bool,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct CheckMapfixRequest{
pub MapfixID:u64,
@@ -43,7 +43,7 @@ pub struct CheckMapfixRequest{
pub SkipChecks:bool,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct ValidateSubmissionRequest{
// submission_id is passed back in the response message
@@ -53,7 +53,7 @@ pub struct ValidateSubmissionRequest{
pub ValidatedModelID:Option<u64>,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct ValidateMapfixRequest{
// submission_id is passed back in the response message
@@ -64,7 +64,7 @@ pub struct ValidateMapfixRequest{
}
// Create a new map
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct UploadSubmissionRequest{
pub SubmissionID:u64,
@@ -73,7 +73,7 @@ pub struct UploadSubmissionRequest{
pub ModelName:String,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct UploadMapfixRequest{
pub MapfixID:u64,
@@ -83,7 +83,7 @@ pub struct UploadMapfixRequest{
}
// Release a new map
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct ReleaseSubmissionRequest{
pub SubmissionID:u64,
@@ -97,14 +97,14 @@ pub struct ReleaseSubmissionRequest{
pub Submitter:u64,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct ReleaseSubmissionsBatchRequest{
pub Submissions:Vec<ReleaseSubmissionRequest>,
pub OperationID:u32,
}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
#[derive(serde::Deserialize)]
pub struct ReleaseMapfixRequest{
pub MapfixID:u64,


@@ -1,4 +1,4 @@
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum ReadDomError{
Binary(rbx_binary::DecodeError),
@@ -79,6 +79,15 @@ pub enum StringValueError{
ValueNotSet,
NonStringValue,
}
impl std::fmt::Display for StringValueError{
fn fmt(&self,f:&mut std::fmt::Formatter<'_>)->std::fmt::Result{
match self{
StringValueError::ObjectNotFound=>write!(f,"Missing StringValue"),
StringValueError::ValueNotSet=>write!(f,"Value not set"),
StringValueError::NonStringValue=>write!(f,"Value is not a String"),
}
}
}
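The new `Display` impl above gives every `StringValue` failure a fixed message, which is what lets the DisplayName and Creator checks earlier in the diff collapse their three per-field `summary!` arms into one `summary_format!` over `{context}`. A standalone mirror of that impl, showing that `to_string()` then comes for free via the blanket `ToString`:

```rust
#[derive(Debug)]
enum StringValueError {
    ObjectNotFound,
    ValueNotSet,
    NonStringValue,
}

// Same messages as the diff; Display makes the enum usable as a
// `{context}` argument in the summary_format! call sites.
impl std::fmt::Display for StringValueError {
    fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
        match self {
            StringValueError::ObjectNotFound => write!(f, "Missing StringValue"),
            StringValueError::ValueNotSet => write!(f, "Value not set"),
            StringValueError::NonStringValue => write!(f, "Value is not a String"),
        }
    }
}
```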
fn string_value(instance:Option<&rbx_dom_weak::Instance>)->Result<&str,StringValueError>{
let instance=instance.ok_or(StringValueError::ObjectNotFound)?;


@@ -1,4 +1,5 @@
use futures::StreamExt;
use futures_util::stream::iter as stream_iter;
use futures_util::StreamExt;
use crate::download::download_asset_version;
use crate::nats_types::ReleaseSubmissionsBatchRequest;
@@ -75,25 +76,27 @@ async fn release_inner(
cloud_context:&rbx_asset::cloud::Context,
cloud_context_luau_execution:&rbx_asset::cloud::Context,
load_asset_version_runtime:&rbx_asset::cloud::LuauSessionLatestRequest,
submissions_service:&crate::grpc::submissions::Service,
submissions:&crate::grpc::submissions::Service,
)->Result<(),InnerError>{
let available_parallelism=std::thread::available_parallelism().map_err(InnerError::Io)?.get();
// set up futures
let submissions=&release_info.Submissions;
// unnecessary allocation :(
let asset_versions:Vec<_> =release_info
.Submissions
.iter()
.map(|submission|rbx_asset::cloud::GetAssetVersionRequest{
asset_id:submission.ModelID,
version:submission.ModelVersion,
})
.enumerate()
.collect();
// fut_download
let fut_download=futures::stream::iter(submissions)
.enumerate()
.map(|(index,submission)|{
let asset_version=rbx_asset::cloud::GetAssetVersionRequest{
asset_id:submission.ModelID,
version:submission.ModelVersion,
};
async move{
let modes=download_fut(cloud_context,asset_version).await;
(index,modes)
}
let fut_download=stream_iter(asset_versions)
.map(|(index,asset_version)|async move{
let modes=download_fut(cloud_context,asset_version).await;
(index,modes)
})
.buffer_unordered(available_parallelism.min(MAX_PARALLEL_DECODE))
.collect::<Vec<(usize,Result<_,DownloadFutError>)>>();
@@ -102,7 +105,7 @@ async fn release_inner(
let fut_load_asset_versions=load_asset_versions(
cloud_context_luau_execution,
load_asset_version_runtime,
submissions.iter().map(|submission|submission.UploadedAssetID),
release_info.Submissions.iter().map(|submission|submission.UploadedAssetID),
);
// execute futures
@@ -111,7 +114,7 @@ async fn release_inner(
let load_asset_versions=load_asset_versions_result.map_err(InnerError::LoadAssetVersions)?;
// sanity check roblox output
if load_asset_versions.len()!=submissions.len(){
if load_asset_versions.len()!=release_info.Submissions.len(){
return Err(InnerError::LoadAssetVersionsListLength);
};
@@ -125,7 +128,7 @@ async fn release_inner(
match result{
Ok(value)=>modes.push(value),
Err(error)=>errors.push(ErrorContext{
submission_id:submissions[index].SubmissionID,
submission_id:release_info.Submissions[index].SubmissionID,
error:error,
}),
}
@@ -135,14 +138,14 @@ async fn release_inner(
}
// concurrently dispatch results
let release_results:Vec<_> =futures::stream::iter(
let release_results:Vec<_> =stream_iter(
release_info
.Submissions
.into_iter()
.zip(modes)
.zip(load_asset_versions)
.map(|((submission,modes),asset_version)|async move{
let result=submissions_service.set_status_released(rust_grpc::validator::SubmissionReleaseRequest{
let result=submissions.set_status_released(rust_grpc::validator::SubmissionReleaseRequest{
submission_id:submission.SubmissionID,
map_create:Some(rust_grpc::maps_extended::MapCreate{
id:submission.UploadedAssetID as i64,
@@ -181,7 +184,7 @@ async fn release_inner(
Ok(())
}
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
UpdateOperation(tonic::Status),
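The download stream above caps in-flight work at `available_parallelism.min(MAX_PARALLEL_DECODE)` before handing it to `buffer_unordered`. A standalone sketch of just that clamp — the constant's value of 16 is inferred from the "use up to 16 parallel requests" commit message, not visible in this hunk:

```rust
// Assumed value; the combobulator commit message suggests 16.
const MAX_PARALLEL_DECODE: usize = 16;

// Mirrors the limit passed to buffer_unordered in release_inner:
// never more in-flight downloads than CPUs, and never more than the cap.
fn decode_concurrency() -> std::io::Result<usize> {
    let available = std::thread::available_parallelism()?.get();
    Ok(available.min(MAX_PARALLEL_DECODE))
}
```

Clamping by `available_parallelism` matters because the decode step is CPU-bound, while the fixed cap keeps the request fan-out to Roblox bounded on large machines.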


@@ -1,7 +1,7 @@
use crate::download::download_asset_version;
use crate::nats_types::UploadMapfixRequest;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum InnerError{
Download(crate::download::Error),
@@ -43,7 +43,7 @@ async fn upload_inner(
Ok(())
}
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ApiActionMapfixUploaded(tonic::Status),


@@ -1,7 +1,7 @@
use crate::download::download_asset_version;
use crate::nats_types::UploadSubmissionRequest;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum InnerError{
Download(crate::download::Error),
@@ -44,7 +44,7 @@ async fn upload_inner(
Ok(upload_response.AssetId)
}
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ApiActionSubmissionUploaded(tonic::Status),


@@ -1,6 +1,6 @@
use crate::nats_types::ValidateMapfixRequest;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ApiActionMapfixValidate(tonic::Status),


@@ -1,6 +1,6 @@
use crate::nats_types::ValidateSubmissionRequest;
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ApiActionSubmissionValidate(tonic::Status),


@@ -1,4 +1,5 @@
use futures::TryStreamExt;
use futures_util::stream::iter as stream_iter;
use futures_util::TryStreamExt;
use rust_grpc::validator::Policy;
use crate::download::download_asset_version;
@@ -17,7 +18,7 @@ fn hash_source(source:&str)->u64{
std::hash::Hasher::finish(&hasher)
}
#[allow(dead_code)]
#[expect(dead_code)]
#[derive(Debug)]
pub enum Error{
ModelInfoDownload(rbx_asset::cloud::GetError),
@@ -52,7 +53,7 @@ impl std::fmt::Display for Error{
}
impl std::error::Error for Error{}
#[allow(nonstandard_style)]
#[expect(nonstandard_style)]
pub struct ValidateRequest{
pub ModelID:u64,
pub ModelVersion:u64,
@@ -153,7 +154,7 @@ impl crate::message_handler::MessageHandler{
}
// send all script hashes to REST endpoint and retrieve the replacements
futures::stream::iter(script_map.iter_mut().map(Ok))
stream_iter(script_map.iter_mut().map(Ok))
.try_for_each_concurrent(Some(SCRIPT_CONCURRENCY),|(source,NamePolicy{policy,name})|async{
// get the hash
let hash=hash_source(source.as_str());
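The `hash_source` helper in this file reduces each script body to a `u64` key before querying the REST endpoint, so the validator ships hashes rather than full sources. The hasher construction is cut off in this hunk, so `DefaultHasher` below is an assumption about the sketch, not the source:

```rust
use std::hash::{Hash, Hasher};

// Sketch of hash_source; DefaultHasher is assumed here — the real code
// may construct a different Hasher with output that is stable across builds.
fn hash_source(source: &str) -> u64 {
    let mut hasher = std::collections::hash_map::DefaultHasher::new();
    source.hash(&mut hasher);
    hasher.finish()
}
```

The same input always yields the same key within a process, which is all the `try_for_each_concurrent` lookup loop needs.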

web/.gitignore vendored

@@ -1,24 +1,12 @@
bun.lockb
# dependencies
/node_modules
/.pnp
.pnp.*
.yarn/*
!.yarn/patches
!.yarn/plugins
!.yarn/releases
!.yarn/versions
# testing
/coverage
# next.js
/.next/
/out/
# production
/build
/dist
# misc
.DS_Store
@@ -29,12 +17,22 @@ npm-debug.log*
yarn-debug.log*
yarn-error.log*
# env files (can opt-in for committing if needed)
# env files
.env*
# vercel
.vercel
.env.local
.env.development.local
.env.test.local
.env.production.local
# typescript
*.tsbuildinfo
next-env.d.ts
# editor directories and files
.vscode/*
!.vscode/extensions.json
.idea
*.suo
*.ntvs*
*.njsproj
*.sln
*.sw?


@@ -1,13 +1,29 @@
FROM registry.itzana.me/docker-proxy/oven/bun:1.3.3
# Build stage
FROM registry.itzana.me/docker-proxy/oven/bun:1.3.3 AS builder
WORKDIR /app
COPY package.json bun.lockb* ./
RUN bun install --frozen-lockfile
COPY . .
RUN bun run build
# Release
FROM registry.itzana.me/docker-proxy/nginx:alpine
COPY --from=builder /app/build /usr/share/nginx/html
# Add nginx configuration for SPA routing
RUN echo 'server { \
listen 3000; \
location / { \
root /usr/share/nginx/html; \
index index.html; \
try_files $uri $uri/ /index.html; \
} \
}' > /etc/nginx/conf.d/default.conf
EXPOSE 3000
ENV NEXT_TELEMETRY_DISABLED=1
RUN bun install
RUN bun run build
ENTRYPOINT ["bun", "run", "start"]
CMD ["nginx", "-g", "daemon off;"]

File diff suppressed because it is too large

web/index.html Normal file

@@ -0,0 +1,20 @@
<!doctype html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<link rel="icon" type="image/x-icon" href="/favicon.ico" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>StrafesNET | Maps</title>
<link rel="preconnect" href="https://fonts.googleapis.com" />
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
<link href="https://fonts.googleapis.com/css2?family=Inter:wght@400;500;600;700&family=Barlow:wght@700&display=swap" rel="stylesheet" />
<style>
* { margin: 0; padding: 0; box-sizing: border-box; }
body { background-color: #09090b; }
</style>
</head>
<body>
<div id="root"></div>
<script type="module" src="/src/main.tsx"></script>
</body>
</html>


@@ -1,16 +0,0 @@
import type { NextConfig } from "next";
const nextConfig: NextConfig = {
distDir: "build",
output: "standalone",
images: {
remotePatterns: [
{
protocol: "https",
hostname: "**.rbxcdn.com",
},
],
},
};
export default nextConfig;

web/package-lock.json generated Normal file

File diff suppressed because it is too large


@@ -2,21 +2,24 @@
"name": "map-service-web",
"version": "0.1.0",
"private": true,
"type": "module",
"scripts": {
"dev": "next dev -p 3000 --turbopack",
"build": "next build",
"start": "next start -p 3000",
"lint": "next lint"
"dev": "vite",
"build": "tsc && vite build",
"preview": "vite preview",
"lint": "eslint src --ext ts,tsx"
},
"dependencies": {
"@emotion/react": "^11.14.0",
"@emotion/styled": "^11.14.1",
"@monaco-editor/react": "^4.7.0",
"@mui/icons-material": "^7.3.6",
"@mui/material": "^7.3.6",
"@tanstack/react-query": "^5.90.12",
"date-fns": "^4.1.0",
"next": "^16.0.7",
"react": "^19.2.1",
"react-dom": "^19.2.1",
"react-router-dom": "^7.1.3",
"sass": "^1.94.2"
},
"devDependencies": {
@@ -24,8 +27,9 @@
"@types/node": "^24.10.1",
"@types/react": "^19.2.7",
"@types/react-dom": "^19.2.3",
"@vitejs/plugin-react": "^4.3.4",
"eslint": "^9.39.1",
"eslint-config-next": "16.0.7",
"typescript": "^5.9.3"
"typescript": "^5.9.3",
"vite": "^6.0.7"
}
}

Binary image changed: 15 KiB before, 15 KiB after

web/src/App.tsx Normal file

@@ -0,0 +1,47 @@
import { Routes, Route } from 'react-router-dom'
import { ThemeProvider, CssBaseline } from '@mui/material'
import { theme } from '@/app/lib/theme'
// Pages
import Home from '@/app/page'
import MapsPage from '@/app/maps/page'
import MapDetailPage from '@/app/maps/[mapId]/page'
import MapFixCreatePage from '@/app/maps/[mapId]/fix/page'
import MapfixesPage from '@/app/mapfixes/page'
import MapfixDetailPage from '@/app/mapfixes/[mapfixId]/page'
import SubmissionsPage from '@/app/submissions/page'
import SubmissionDetailPage from '@/app/submissions/[submissionId]/page'
import SubmitPage from '@/app/submit/page'
import AdminSubmitPage from '@/app/admin-submit/page'
import OperationPage from '@/app/operations/[operationId]/page'
import ReviewerDashboardPage from '@/app/reviewer-dashboard/page'
import UserDashboardPage from '@/app/user-dashboard/page'
import ScriptReviewPage from '@/app/script-review/page'
import NotFound from '@/app/not-found/page'
function App() {
return (
<ThemeProvider theme={theme}>
<CssBaseline />
<Routes>
<Route path="/" element={<Home />} />
<Route path="/maps" element={<MapsPage />} />
<Route path="/maps/:mapId" element={<MapDetailPage />} />
<Route path="/maps/:mapId/fix" element={<MapFixCreatePage />} />
<Route path="/mapfixes" element={<MapfixesPage />} />
<Route path="/mapfixes/:mapfixId" element={<MapfixDetailPage />} />
<Route path="/submissions" element={<SubmissionsPage />} />
<Route path="/submissions/:submissionId" element={<SubmissionDetailPage />} />
<Route path="/submit" element={<SubmitPage />} />
<Route path="/admin-submit" element={<AdminSubmitPage />} />
<Route path="/operations/:operationId" element={<OperationPage />} />
<Route path="/review" element={<ReviewerDashboardPage />} />
<Route path="/dashboard" element={<UserDashboardPage />} />
<Route path="/script-review" element={<ScriptReviewPage />} />
<Route path="*" element={<NotFound />} />
</Routes>
</ThemeProvider>
)
}
export default App


@@ -0,0 +1,34 @@
import { Box } from '@mui/material';
import { surface } from '@/app/lib/colors';
const AnimatedBackground: React.FC = () => {
return (
<Box
sx={{
position: 'fixed',
inset: 0,
zIndex: 0,
overflow: 'hidden',
pointerEvents: 'none',
background: `
radial-gradient(ellipse 80% 60% at 50% -20%, rgba(124, 58, 237, 0.08) 0%, transparent 100%),
radial-gradient(ellipse 60% 40% at 80% 80%, rgba(34, 211, 238, 0.04) 0%, transparent 100%),
${surface.base}
`,
}}
>
<Box
sx={{
position: 'absolute',
inset: 0,
opacity: 0.03,
backgroundImage: `url("data:image/svg+xml,%3Csvg viewBox='0 0 256 256' xmlns='http://www.w3.org/2000/svg'%3E%3Cfilter id='n'%3E%3CfeTurbulence type='fractalNoise' baseFrequency='0.9' numOctaves='4' stitchTiles='stitch'/%3E%3C/filter%3E%3Crect width='100%25' height='100%25' filter='url(%23n)'/%3E%3C/svg%3E")`,
backgroundRepeat: 'repeat',
backgroundSize: '128px 128px',
}}
/>
</Box>
);
};
export default AnimatedBackground;


@@ -1,6 +1,6 @@
import {Box, IconButton, Typography} from "@mui/material";
import {Box, Button, IconButton, Typography} from "@mui/material";
import {useEffect, useRef, useState} from "react";
import Link from "next/link";
import { Link } from "react-router-dom";
import ArrowBackIosNewIcon from "@mui/icons-material/ArrowBackIosNew";
import ArrowForwardIosIcon from "@mui/icons-material/ArrowForwardIos";
import {SubmissionInfo} from "@/app/ts/Submission";
@@ -65,14 +65,22 @@ export function Carousel<T extends CarouselItem>({ title, items, renderItem, vie
return (
<Box mb={6}>
<Box display="flex" justifyContent="space-between" alignItems="center" mb={2}>
<Box display="flex" justifyContent="space-between" alignItems="center" mb={3}>
<Typography variant="h4" component="h2" fontWeight="bold">
{title}
</Typography>
<Link href={viewAllLink} style={{textDecoration: 'none'}}>
<Typography component="span" color="primary">
View All
</Typography>
<Link to={viewAllLink} style={{textDecoration: 'none'}}>
<Button
endIcon={<ArrowForwardIosIcon sx={{ fontSize: '0.875rem' }} />}
sx={{
color: 'primary.main',
'&:hover': {
backgroundColor: 'rgba(167, 139, 250, 0.08)',
},
}}
>
View All
</Button>
</Link>
</Box>
@@ -85,9 +93,12 @@ export function Carousel<T extends CarouselItem>({ title, items, renderItem, vie
transform: 'translateY(-50%)',
zIndex: 2,
backgroundColor: 'background.paper',
boxShadow: 2,
border: '1px solid rgba(167, 139, 250, 0.15)',
boxShadow: '0 4px 12px rgba(0, 0, 0, 0.3)',
'&:hover': {
backgroundColor: 'action.hover',
backgroundColor: 'background.paper',
borderColor: 'rgba(167, 139, 250, 0.3)',
boxShadow: '0 8px 20px rgba(167, 139, 250, 0.2)',
},
visibility: scrollPosition <= 5 ? 'hidden' : 'visible',
}}
@@ -106,7 +117,7 @@ export function Carousel<T extends CarouselItem>({ title, items, renderItem, vie
'&::-webkit-scrollbar': {
display: 'none',
},
gap: '16px', // Fixed 16px gap - using string with px unit to ensure it's absolute
gap: '20px',
padding: '8px 4px',
}}
>
@@ -116,7 +127,7 @@ export function Carousel<T extends CarouselItem>({ title, items, renderItem, vie
sx={{
flex: '0 0 auto',
width: {
xs: '260px', // Fixed width at different breakpoints
xs: '260px',
sm: '280px',
md: '300px'
}
@@ -135,9 +146,12 @@ export function Carousel<T extends CarouselItem>({ title, items, renderItem, vie
transform: 'translateY(-50%)',
zIndex: 2,
backgroundColor: 'background.paper',
boxShadow: 2,
border: '1px solid rgba(167, 139, 250, 0.15)',
boxShadow: '0 4px 12px rgba(0, 0, 0, 0.3)',
'&:hover': {
backgroundColor: 'action.hover',
backgroundColor: 'background.paper',
borderColor: 'rgba(167, 139, 250, 0.3)',
boxShadow: '0 8px 20px rgba(167, 139, 250, 0.2)',
},
visibility: scrollPosition >= maxScroll - 5 ? 'hidden' : 'visible',
}}


@@ -1,13 +1,14 @@
import React from 'react';
import {
Box,
Avatar,
Typography,
Tooltip
Tooltip,
Skeleton
} from "@mui/material";
import PersonIcon from '@mui/icons-material/Person';
import { formatDistanceToNow, format } from "date-fns";
import { AuditEvent, decodeAuditEvent as auditEventMessage } from "@/app/ts/AuditEvent";
import { useUserThumbnail } from "@/app/hooks/useThumbnails";
interface AuditEventItemProps {
event: AuditEvent;
@@ -15,17 +16,44 @@ interface AuditEventItemProps {
}
export default function AuditEventItem({ event, validatorUser }: AuditEventItemProps) {
const isValidator = event.User === validatorUser;
const { thumbnailUrl, isLoading } = useUserThumbnail(isValidator ? undefined : event.User, '150x150');
return (
<Box sx={{ display: 'flex', gap: 2 }}>
<Avatar
src={event.User === validatorUser ? undefined : `/thumbnails/user/${event.User}`}
>
<PersonIcon />
</Avatar>
<Box sx={{
display: 'flex',
gap: 2,
p: 2,
borderRadius: 1
}}>
<Box sx={{ position: 'relative', width: 40, height: 40 }}>
<Skeleton
variant="circular"
sx={{
position: 'absolute',
width: 40,
height: 40,
opacity: isLoading ? 1 : 0,
transition: 'opacity 0.3s ease-in-out',
}}
animation="wave"
/>
<Avatar
src={isValidator ? undefined : (thumbnailUrl || undefined)}
sx={{
width: 40,
height: 40,
opacity: isLoading ? 0 : 1,
transition: 'opacity 0.3s ease-in-out',
}}
>
<PersonIcon />
</Avatar>
</Box>
<Box sx={{ flexGrow: 1 }}>
<Box sx={{ display: 'flex', justifyContent: 'space-between' }}>
<Typography variant="subtitle2">
{event.User === validatorUser ? "Validator" : event.Username || "Unknown"}
{isValidator ? "Validator" : event.Username || "Unknown"}
</Typography>
<DateDisplay date={event.Date} />
</Box>

Some files were not shown because too many files have changed in this diff