I never depend on a model's built-in training when using third-party libraries. Providing tons of additional context to the model (a skill, example repos, or Context7 snippets that I manually curate) is more effort up-front and takes longer, but the results are worth it.
Stuff I threw into the inputs before working with pyinfra:
I used Ansible for years, and pyinfra is very approachable since it has similar concepts, like inventories and common operations such as files.put and server.shell. Loving it so far, and it is quite fast.
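For anyone curious how those concepts look in practice, here is a minimal sketch of a pyinfra deploy file using the files.put and server.shell operations mentioned above. The file paths and service name are made up for illustration; a deploy file like this is executed by the pyinfra CLI against an inventory, not run standalone.

```python
# deploy.py: run with `pyinfra inventory.py deploy.py`
from pyinfra.operations import files, server

# Upload a local file to every host in the inventory
files.put(
    name="Upload nginx config",
    src="files/nginx.conf",
    dest="/etc/nginx/nginx.conf",
)

# Run an arbitrary shell command on each host afterwards
server.shell(
    name="Reload nginx",
    commands=["systemctl reload nginx"],
)
```

Like an Ansible playbook, the file is declarative: pyinfra diffs the desired state against each host and only executes what is needed.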
I get this error too, even when I retry: { ... "error":{"type":"permission_error","message":"anthropic.claude-opus-4-7 is not available for this account. You can explore other available models on Amazon Bedrock. For additional access options, contact AWS Sales at https://aws.amazon.com/contact-us/sales-support/"}}
I use it solely to provide a web UI for managing virtual machines on individual servers.
It hooks into libvirt for this, so I can also manage them via virsh et al., but it's a nice tool for setting up the essentials of a VM and providing remote access to the VM console.
Working on https://dataraven.io/ – a low-cost, cloud-agnostic data movement platform focused on object storage. In the past month I've added API keys, audit logs, and rclone.conf import for rapid onboarding.
rclone is doing the heavy lifting of reliable, fast cloud-to-cloud transfer. I'm wrapping it with the operational features clients have asked me for over the years:
- Team workspaces with role-based access control & complete audit log of all activity
- Notifications – alerts on transfer failures or resource changes via Slack, Teams, JSON webhooks, etc.
- Centralized log storage/archiving
- Bring Your Own Vault integrations – connect 1Password, Doppler, or Infisical for zero-knowledge credential handling
- 10 Gbps connected infrastructure for handling large transfers
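For context on the rclone.conf import: this is the kind of file it ingests. A minimal sketch with a single S3 remote, where the remote name, provider, and region are just examples:

```ini
[s3-prod]
type = s3
provider = AWS
env_auth = true
region = us-east-1
```

Each `[section]` is one remote, so a user's existing rclone setup maps directly onto workspace connections.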
It supports any OpenAI-compatible API out of the box, so AWS Bedrock, LiteLLM, Ollama, etc. should all work. The free testing LLM is just there for a quick demo; please bring your own LLM for long-term usage.
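A quick sketch of why "OpenAI-compatible" backends are interchangeable: the request shape is identical and only the base URL (and model name) change. The URLs and model names below are examples, not endpoints this project ships.

```python
def chat_request(base_url: str, model: str, prompt: str) -> tuple[str, dict]:
    """Build the endpoint URL and JSON body for a chat completion call."""
    url = f"{base_url.rstrip('/')}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, body

# The same code can target Ollama's local endpoint or a LiteLLM proxy:
ollama_call = chat_request("http://localhost:11434/v1", "llama3", "hello")
litellm_call = chat_request("http://localhost:4000", "my-proxy-model", "hello")
```

Swapping providers is then a configuration change rather than a code change, which is what makes bring-your-own-LLM setups practical.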
https://better-auth.com/docs/plugins/jwt