# Managing Multiple Laravel Cloud Apps
I run eight Laravel applications on Laravel Cloud. Each has its own repository, environment variables, and deployment pipeline. Managing them one at a time through the dashboard works when you have two projects. At eight, you need automation. Here's how I handle it.
## The API
Laravel Cloud has a REST API that covers most of what the dashboard does. Authentication is a bearer token, and the responses are standard JSON. I store my API token in Bitwarden and pull it as needed.
```shell
# List all applications
curl -s "https://cloud.laravel.com/api/applications" \
  -H "Authorization: Bearer $CLOUD_TOKEN" \
  -H "Accept: application/json" | jq '.[].name'
```
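Since the token lives in Bitwarden, a small helper can fetch it on demand via the Bitwarden CLI. This is a sketch under assumptions: `laravel-cloud-api` is a placeholder item name, not something Laravel Cloud or Bitwarden defines.

```shell
# Fetch the Laravel Cloud API token from Bitwarden on demand.
# "laravel-cloud-api" is a placeholder item name -- replace with your own.
cloud_token() {
  bw get password laravel-cloud-api
}

# Usage: export CLOUD_TOKEN="$(cloud_token)"
```

Exporting once per shell session keeps the token out of scripts and shell history.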
## Environment variables without the dashboard
Adding environment variables through the dashboard means clicking into each app, finding the environment, editing the variables, and saving. With the API, I can push variables to any environment in one command:
```shell
# Add a variable without replacing existing ones
curl -X POST "https://cloud.laravel.com/api/environments/$ENV_ID/variables" \
  -H "Authorization: Bearer $CLOUD_TOKEN" \
  -H "Accept: application/json" \
  -H "Content-Type: application/json" \
  -d '{"method":"append","variables":[{"key":"NEW_SERVICE_KEY","value":"secret123"}]}'
```
The `method` field matters. Use `append` to add variables without touching existing ones. Use `replace` to overwrite the entire set, which is useful for full environment syncs but dangerous if you forget a variable.
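With eight apps, the same variable often needs to land in several environments. A minimal sketch that loops the `append` call above over a list of environment IDs; the IDs in `ENV_IDS` are placeholders you'd substitute with your own.

```shell
# Push one variable to every environment in the list using method=append.
# The IDs below are placeholders -- substitute your own environment IDs.
ENV_IDS=("env-aaaa1111" "env-bbbb2222")

push_variable() {
  local key="$1" value="$2" env_id
  for env_id in "${ENV_IDS[@]}"; do
    curl -s -X POST "https://cloud.laravel.com/api/environments/$env_id/variables" \
      -H "Authorization: Bearer $CLOUD_TOKEN" \
      -H "Accept: application/json" \
      -H "Content-Type: application/json" \
      -d "{\"method\":\"append\",\"variables\":[{\"key\":\"$key\",\"value\":\"$value\"}]}"
  done
}

# Usage: push_variable NEW_SERVICE_KEY secret123
```

Because the call uses `append`, rerunning it is safe for existing variables in each environment.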
## Batch dependency updates
Every few days, I run a shell script that updates Composer and NPM dependencies across all projects, runs the test suites in parallel, and commits the changes. The script skips the packages directory — those have their own release cycles.
```shell
# The core loop (simplified)
for project in ~/Development/php/laravel/*/; do
  (
    cd "$project" || exit
    composer update --no-interaction
    npm update
    # Skip the commit entirely if the test suite fails
    php artisan test --parallel || exit
    git add -A && git commit -m "chore: update dependencies" && git push
  )
done
Since all my Cloud apps deploy on push to main, this single script triggers deployments across all projects. If tests fail, the commit is skipped and I investigate manually.
## The Cloud CLI
For interactive tasks like running artisan commands on a production environment, the Cloud CLI is easier than the API:
```shell
# Run a command on production
cloud command:run production --cmd="php artisan migrate --force"

# Check the output
cloud command:get <command-id>
```
The CLI is pre-authenticated after initial setup, so there's no token juggling. I use the API for automation and the CLI for one-off operations.
## Keeping track of IDs
Every Cloud app has an application ID, environment ID, and Nightwatch ID. I keep a lookup table in my global Claude Code configuration so I can reference any project by name:
| Project | Cloud App ID | Cloud Env ID | Nightwatch ID |
|-----------------|---------------|---------------|---------------|
| pulli.dev | app-a0fb... | env-a0fb... | a11c27... |
Each project's CLAUDE.md also has its own IDs embedded. When I'm working in a project and need to check Nightwatch or run a Cloud command, the context is already there.
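The lookup table can also double as machine-readable data for scripts. A minimal sketch, assuming the IDs are mirrored in a local `ids.json` file — a hypothetical file I'm introducing here, not something Cloud generates:

```shell
# ids.json is a hypothetical local file mirroring the lookup table, e.g.:
# {"pulli.dev": {"app": "app-a0fb...", "env": "env-a0fb...", "nightwatch": "a11c27..."}}

# Print the Cloud environment ID for a project name (empty if unknown).
cloud_env_id() {
  jq -r --arg name "$1" '.[$name].env // empty' ids.json
}

# Usage: ENV_ID="$(cloud_env_id pulli.dev)"
```

Combined with the API calls above, this lets a script address any project by name instead of hard-coding IDs.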
## What I would improve
- No API for deleting individual environment variables. You can append or replace the entire set, but removing a single variable requires the dashboard.
- No webhook notifications for deployment status. I'd love a Discord ping when a deployment succeeds or fails.
- The API is in early access. Endpoints may change, so I keep my automation scripts simple and easy to update.
Despite these gaps, the combination of API, CLI, and git-push deploys covers 95% of what I need. The remaining 5% is dashboard work I do once a quarter at most.