If I had a dollar for every API key inside a config.json…
Here's the thing: config.json should have been in the project's .gitignore.
Not exactly because of credentials. But then, how do you change it to test with different settings?
For a lot of my projects, there is a config-<environment>.json that is selected at startup based on the environment.
Nothing secure in those, however.
> But, how do you change it to test with different settings?
When it's really messy, we:
- check in a template file,
- securely share a .env file (and .gitignore it),
- and check in a one-line script that inflates the real config file (which we also .gitignore).
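A minimal sketch of such an inflate script, assuming the shared .env has already been loaded into the environment (file and key names here are invented for illustration, not taken from the thread):

```python
import json
import os
from pathlib import Path

def inflate_config(template_path: str, out_path: str) -> None:
    """Fill the checked-in template with secrets from the environment.

    Assumes the .env file was already loaded into the environment
    (e.g. by the shell or a tool like direnv).
    """
    config = json.loads(Path(template_path).read_text())
    config["apiKey"] = os.environ["API_KEY"]  # fails loudly if unset
    Path(out_path).write_text(json.dumps(config, indent=2))
```

The generated config.json itself stays in .gitignore; only the template and this script are checked in.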
I actually do have a dollar for every API key I or my team have committed inside a config file.
And...I'm doing pretty well.
Also, I've built some close friendships with our Cybersecurity team.
Can I have a dollar for every public S3 bucket?
Might just make enough to pay your AWS bill this month.
At my workplace, we use the string @nocommit to designate code that shouldn't be checked in. Usually in a comment:
// @nocommit temporary for testing
apiKey = 'blah';
// apiKey = getKeyFromKeychain();
but it can be anywhere in the file.
There's a lint rule that looks for @nocommit in all modified files. It shows a lint error in dev and in our code review / build system, and commits that contain @nocommit anywhere are completely blocked from being merged.
(The code in the lint rule does something like "@no" + "commit" to avoid triggering itself.)
> At my workplace, we use the string @nocommit to designate code that shouldn’t be checked in
That approach seems useful, but it wouldn't have prevented the PyPI incident OP links to: the access token was temporarily entered in a .py source file, but that file was never committed to git. The leak was via the .pyc compiled bytecode files, which made it into a published Docker build.
Yeah, but a combination of this approach and adding all compiled file types, including .pyc, to .gitignore would fix it.
> adding all compiled file types including .pyc to .gitignore would fix it
But in this case they didn't accidentally put the token in git; the place where they forgot to put *.pyc was .dockerignore.
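For the .pyc case specifically, the fix is a couple of lines in the build-context ignore file (the entries below are a generic sketch, not the project's actual files):

```
# .dockerignore — keep build-only and compiled files out of the image
*.pyc
__pycache__/
.env
.git
```

Largely the same entries belong in .gitignore, since the two files are checked independently: .gitignore guards commits, .dockerignore guards the image build context.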
This sounds like a really useful solution. How do you implement something like this, especially with linter integration?
I'm not sure, sorry. The source control team at work set it up a long time ago. I don't know how it works - I'm just a user of it.
The linter probably just runs git diff | grep @nocommit or similar.
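I don't know their actual setup either, but a bare-bones version of that check as a git pre-commit hook could be sketched in Python like this (the marker string is split so the hook never flags itself):

```python
#!/usr/bin/env python3
# Sketch of .git/hooks/pre-commit: block commits whose staged diff
# adds a line containing the marker.
import subprocess
import sys

MARKER = "@no" + "commit"  # split so this file never triggers the check

def find_marker_lines(diff_text: str) -> list[str]:
    """Return the lines added by the diff that contain the marker."""
    return [
        line
        for line in diff_text.splitlines()
        if line.startswith("+") and MARKER in line
    ]

if __name__ == "__main__":
    try:
        proc = subprocess.run(
            ["git", "diff", "--cached"], capture_output=True, text=True
        )
    except FileNotFoundError:
        sys.exit(0)  # git not on PATH; nothing to check
    hits = find_marker_lines(proc.stdout)
    if hits:
        print("Commit blocked; remove these lines first:")
        for hit in hits:
            print("  " + hit)
        sys.exit(1)
```

A local hook like this is advisory only, since developers can skip it with `git commit --no-verify`; the merge-blocking variant described above has to run server-side or in CI.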
Depending on which stack you’re using, you could use https://danger.systems to automatically fail PRs.
PRs? Isn't the point of @nocommit that something does not get committed, and therefore no credentials are stored in the git repository? Even if the PR does not get merged, the file is still stored as a Git object and can be restored.
I read the lint part and my brain forgot about everything else. You could stick the danger call in a pre-commit hook, though.
This is a huge idea. I'm stealing it.
Not just for credentials; there are many times where I change a setting or whatever and just put "//TODO: remember to set it back to '...' before committing". I forget to change it back 99% of the time.
Neat idea. This could be refined by adding a git hook that runs (rip)grep on the entire codebase and fails upon commit if anything is found, which may accomplish a similar result and stop the code from being committed entirely. Requires a bit more setup work on the developer's end, though.
Would a git hook block you from committing it locally, or would it just run on the server side?
I'm not sure how our one at work is implemented, but we can actually commit @nocommit files in our local repo, and push them into the code review system. We just can't merge any changes that contain it.
It's used for common workflows like creating new database entities. During development, the ORM system creates a dev database on a test DB cluster and automatically points the code for the new table to it, with a @nocommit comment above it. When the code is approved, the new schema is pushed to prod and the code is updated to point to the real DB.
Also, the codebase is way too large for something like ripgrep to search in a reasonable time, which is why it only searches the commit diffs themselves.
There are many git hooks. One of them checks each commit, but there's another that triggers on merges.
I also personally wonder how a PyPI Admin & Director of Infrastructure can miss so many basic coding and security-relevant aspects:
- Hardcoding credentials instead of using dedicated secret files, environment variables, or other secret stores
- For any source that you compile, you have to assume that - in one way or another - it ends up in the final artifact - apparently this was not fully understood (".pyc files containing the compiled bytecode weren't considered")
- Not using an isolated build process, e.g. a CI with an isolated VM or a container - this will inevitably lead to "works on my machine" scenarios
- Needing the built artifact (container image) only locally but pushing it into a publicly available registry
- Using an access token that has full admin permissions for everything, despite only requiring it to bypass rate limits
- Apparently using a single access token for everything
- When you use Git locally and want to push to GitHub, you need an access token. The fact that the article says "the one and only GitHub access token related to my account" likely indicates that this token was at least also used for that
- One of the takeaways of the article says "set aggressive expiration dates for API tokens" - this won't help much if you don't understand how to handle them properly in the first place. An attacker can still use them before they expire, or simply extract updated tokens from newer artifacts.
On the other hand, what went well:
- When this was reported, it was reacted upon within a few minutes
- Some of my above points of criticism now appear to have been taken into account ("Takeaways")
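On the first point, the environment-variable version of this is tiny (the variable name below is an assumption for illustration, not from the article):

```python
import os

def get_api_token() -> str:
    """Read the token from the environment instead of hardcoding it."""
    token = os.environ.get("PYPI_API_TOKEN")  # name is illustrative
    if not token:
        # Fail loudly rather than silently falling back to a literal
        raise RuntimeError("PYPI_API_TOKEN is not set")
    return token
```

Since the literal never appears in the source, there is nothing for the bytecode compiler to bake into a .pyc.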
> This will inevitably lead to “works on my machine” scenarios
Isn’t that what Python is all about?
I feel seen.
Yes kids, the only stuff in ANY repo (public or otherwise) should be source code.
If it is compiled, built, or otherwise modified by any process outside of you the developer typing in your source code editor, it needs to be excluded/ignored from being committed. No excuses. None. Nope, not even that one.
No. 👏 Excuses. 👏
Two choices: either the production software isn't in the exact state the repo was in when the software was built, or I can't get build timestamps in the software.
> This will inevitably lead to "works on my machine" scenarios
Isn't this why Docker exists? It's "works on my machine"-as-a-service.
> When you use Git locally and want to push to GitHub, you need an access token.
I don't understand; I can push to GitHub using HTTPS creds or an SSH key without creating access tokens.
Don't commit credentials; split them up, place each part in a different place in the code, use code comments as a treasure map, and make them work for it.
Ah, the horcrux technique.
I mean, turns out it is pretty easy actually, Boromir.
On the contrary, one can commit or compile credentials quite simply... Maybe Boromir isn't the right person to ask.
Are you doubting Boromir's programming ability?
I just commit and push the entire contents of my home folder and let people figure it out for themselves
Fun fact: if you search for "removed key" or something similar in GitHub you will get thousands of results of people removing accidentally committed keys. I'm guessing the vast majority of those removed keys haven't been revoked.
This reminds me of that one time when I pushed with my GitHub token as my username (dw, I revoked it)
@carrylex git should be password manager aware and refuse to commit if changes include a password
Well, from my personal PoV there are a few problems with that:
- You can't detect all credentials reliably; they could be encoded in base64, for example
- I think it's kind of okay to commit credentials and configuration used for the local dev environment (and ONLY the local one). E.g. when you require some infrastructure like a database inside a container for your app. Not every dev wants to manually set a few dozen configuration entries when they quickly want to check out and run the app
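For that local-only case, this usually looks like a compose file with throwaway values every dev shares (a generic sketch; the service and credential names are invented):

```yaml
# docker-compose.yml — local dev only; these values never reach production
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: dev
      POSTGRES_PASSWORD: dev-only-password  # deliberately not a secret
    ports:
      - "5432:5432"
```

The risk is that a "dev-only" file quietly becomes the template for staging or prod, at which point the committed values stop being harmless.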
> You can’t detect all credentials reliably,
Easy. You check in the password file first. Then you can check if the codebase contains any entry on the blacklist.
Wait…
You were so close! The right solution is of course training an AI model that detects credentials and rejects commits that contain them!
You joke, but GitHub Advanced Security does this and more. On top of the AI component, they check the hash of all things that look like an API key and then also check them against their integrated vendors to see if they’re non-expired. I don’t know how well it works, but they claim a 0.1% false positive rate or something like that.
I need one of those reminder bots, so I can share a link to an inevitable startup, six months from now, based on your humorous comment.
> I think it's kind of okay to commit credentials and configuration used for the local dev environment (and ONLY the local one).
No. Never.
> E.g. when you require some infrastructure like a database inside a container for your app. Not every dev wants to manually set a few dozen configuration entries when they quickly want to check out and run the app
In this situation, it would be better to write a simple script that generates fresh, unique values for the dev environment.
Laziness is not an excuse.
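Such a script can be a few lines; this sketch (file and key names invented) writes a fresh random password into a git-ignored env file:

```python
import secrets
from pathlib import Path

def write_dev_env(path: str = ".env.dev") -> None:
    """Generate fresh, unique throwaway credentials for local dev.

    The output file is expected to be listed in .gitignore.
    """
    password = secrets.token_urlsafe(16)  # random, unique per developer
    Path(path).write_text(f"DB_USER=dev\nDB_PASSWORD={password}\n")
```

Run once after cloning; every developer ends up with different values, so nothing shared is worth stealing.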
They do. But, as they say, make it idiot-proof, and someone will make a better idiot.
GitHub != Git
You’re right. I do that sometimes.
Programmer Humor
Welcome to Programmer Humor!
This is a place where you can post jokes, memes, humor, etc. related to programming!
For sharing awful code there's also Programming Horror.
Rules
- Keep content in English
- No advertisements
- Posts must be related to programming or programmer topics