Jason Cameron 2904ff974b
refactor(ogtags): optimize URL construction and memory allocations (#647)
* refactor(ogtags): optimize URL construction and memory allocations

* test(ogtags): add benchmarks and memory usage tests for OGTagCache

* refactor(ogtags): optimize OGTags subsystem to reduce allocations and improve request runtime by up to 66%

* Update docs/docs/CHANGELOG.md

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Signed-off-by: Jason Cameron <jasoncameron.all@gmail.com>

* refactor(ogtags): optimize URL string construction to reduce allocations
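
  A minimal sketch of the kind of change this describes, assuming a hypothetical helper; the real code in internal/ogtags/ogtags.go may be structured differently. The idea is to pre-size a strings.Builder so the target URL is assembled in a single allocation instead of going through fmt.Sprintf or repeated concatenation:

    package ogtags

    import "strings"

    // buildTargetURL is illustrative only; the actual identifiers differ.
    func buildTargetURL(scheme, host, path, rawQuery string) string {
        var b strings.Builder
        // Grow once so the builder never reallocates while writing.
        size := len(scheme) + len("://") + len(host) + len(path)
        if rawQuery != "" {
            size += 1 + len(rawQuery)
        }
        b.Grow(size)
        b.WriteString(scheme)
        b.WriteString("://")
        b.WriteString(host)
        b.WriteString(path)
        if rawQuery != "" {
            b.WriteByte('?')
            b.WriteString(rawQuery)
        }
        return b.String()
    }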

* Update internal/ogtags/ogtags.go

Co-authored-by: Xe Iaso <me@xeiaso.net>
Signed-off-by: Jason Cameron <jasoncameron.all@gmail.com>

* test(ogtags): add fuzz tests for getTarget and extractOGTags functions
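
  As a rough illustration of what such fuzz tests can look like, here is a sketch reusing the hypothetical buildTargetURL helper from the earlier snippet (the real getTarget and extractOGTags signatures may differ); the property checked is simply that arbitrary inputs never cause a panic and that the result still parses:

    package ogtags

    import (
        "net/url"
        "testing"
    )

    func FuzzBuildTargetURL(f *testing.F) {
        // Seed corpus with representative inputs.
        f.Add("example.com", "/blog/post", "id=1")
        f.Add("example.com", "/", "")
        f.Fuzz(func(t *testing.T, host, path, query string) {
            got := buildTargetURL("http", host, path, query)
            if _, err := url.Parse(got); err != nil {
                // Arbitrary fuzz inputs may legitimately yield unparsable
                // URLs; the main property is that nothing panics.
                t.Skip()
            }
        })
    }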

* fix(ogtags): update memory calculation logic

Previously it would report that we had allocated 18 PB:

=== RUN   TestMemoryUsage
    mem_test.go:107: Memory allocated for 10k getTarget calls: 18014398509481904.00 KB
    mem_test.go:135: Memory allocated for 1k extractOGTags calls: 18014398509481978.00 KB

    Now it's fixed with:

    === RUN   TestMemoryUsage
    mem_test.go:109: Memory allocated for 10k getTarget calls:
    mem_test.go:110:   Total: 630.56 KB (0.62 MB)
    mem_test.go:111:   Per operation: 64.57 bytes
    mem_test.go:140: Memory allocated for 1k extractOGTags calls:
    mem_test.go:141:   Total: 328.17 KB (0.32 MB)
    mem_test.go:142:   Per operation: 336.05 bytes
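
  The before/after numbers hint at what went wrong: the reported figures sit just below 2^64 bytes expressed in kilobytes (18014398509481984 KB), which points to an unsigned subtraction wrapping around when the second MemStats reading was smaller than the first (for example, runtime.MemStats.Alloc is a gauge that can drop after a garbage collection). A hedged sketch of this kind of fix, using the monotonic TotalAlloc counter so the delta can never underflow (names are illustrative, not the test's actual code):

    package ogtags

    import (
        "runtime"
        "testing"
    )

    // measureAllocs reports average bytes allocated per call of op.
    func measureAllocs(t *testing.T, iterations int, op func()) float64 {
        t.Helper()
        var before, after runtime.MemStats
        runtime.GC()
        runtime.ReadMemStats(&before)
        for i := 0; i < iterations; i++ {
            op()
        }
        runtime.ReadMemStats(&after)
        // TotalAlloc only ever increases, so this difference cannot wrap.
        total := after.TotalAlloc - before.TotalAlloc
        return float64(total) / float64(iterations)
    }

  The standard library's testing.AllocsPerRun offers a similar measurement if counting allocations rather than bytes is enough.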

* refactor(ogtags): optimize meta tag extraction for improved performance
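
  A rough sketch of what allocation-light meta tag extraction over an x/net/html tree can look like; the actual extractOGTags may differ in structure and filtering, but the pattern is to scan each node's attributes in place rather than copying them into intermediate maps or slices:

    package ogtags

    import (
        "strings"

        "golang.org/x/net/html"
    )

    func extractOGTags(doc *html.Node) map[string]string {
        tags := make(map[string]string)
        var walk func(n *html.Node)
        walk = func(n *html.Node) {
            if n.Type == html.ElementNode && n.Data == "meta" {
                var prop, content string
                // Read attributes directly off the node; no temporary copies.
                for _, a := range n.Attr {
                    switch a.Key {
                    case "property", "name":
                        prop = a.Val
                    case "content":
                        content = a.Val
                    }
                }
                if strings.HasPrefix(prop, "og:") && content != "" {
                    tags[prop] = content
                }
            }
            for c := n.FirstChild; c != nil; c = c.NextSibling {
                walk(c)
            }
        }
        walk(doc)
        return tags
    }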

* Update metadata

check-spelling run (pull_request) for json/ogmem

Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
on-behalf-of: @check-spelling <check-spelling-bot@check-spelling.dev>

* chore: update CHANGELOG for recent optimizations and version bump

* refactor: improve URL construction and meta tag extraction logic

* style: clean up fuzz tests

---------

Signed-off-by: Jason Cameron <jasoncameron.all@gmail.com>
Signed-off-by: check-spelling-bot <check-spelling-bot@users.noreply.github.com>
Signed-off-by: Jason Cameron <git@jasoncameron.dev>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Xe Iaso <me@xeiaso.net>
2025-06-13 09:53:10 -04:00
docs refactor(ogtags): optimize URL construction and memory allocations (#647) 2025-06-13 09:53:10 -04:00
manifest chore(docs/manifest): try no-js challenge to see how it impacts false positive rate 2025-06-06 21:40:28 -04:00
src fix(docs): make the docs respect light/dark mode (#334) 2025-04-23 04:01:02 +00:00
static chore(sponsors): add Raptor Computing Systems 2025-06-03 17:49:28 -04:00
.dockerignore add docs site based on Docusaurus (#35) 2025-03-20 15:06:58 -04:00
.gitignore add docs site based on Docusaurus (#35) 2025-03-20 15:06:58 -04:00
Dockerfile Explicitly define image sources in Dockerfile (#21) 2025-03-20 17:28:30 -04:00
docusaurus.config.ts fix(docs): make the docs respect light/dark mode (#334) 2025-04-23 04:01:02 +00:00
package-lock.json build(deps): bump estree-util-value-to-estree in /docs (#336) 2025-04-23 07:09:01 -04:00
package.json add docs site based on Docusaurus (#35) 2025-03-20 15:06:58 -04:00
README.md add docs site based on Docusaurus (#35) 2025-03-20 15:06:58 -04:00
sidebars.ts add docs site based on Docusaurus (#35) 2025-03-20 15:06:58 -04:00
tsconfig.json add docs site based on Docusaurus (#35) 2025-03-20 15:06:58 -04:00

Website

This website is built using Docusaurus, a modern static website generator.

Installation

$ yarn

Local Development

$ yarn start

This command starts a local development server and opens up a browser window. Most changes are reflected live without having to restart the server.

Build

$ yarn build

This command generates static content into the build directory, which can be served by any static content hosting service.

Deployment

Using SSH:

$ USE_SSH=true yarn deploy

Not using SSH:

$ GIT_USER=<Your GitHub username> yarn deploy

If you are using GitHub Pages for hosting, this command is a convenient way to build the website and push it to the gh-pages branch.