I've written a script that extracts some data from an API and builds an Excel file. I'm not a developer; it's my first real program. I host the code on Google Colab.
The API secret keys are in the code in clear text. I want to share the notebook via a Google Drive link with the people who need to generate the Excel file, so that they can run it themselves. However, I would rather not include the secret keys in clear text, to avoid accidental sharing outside the company.
I'm wondering how to hide them, or how to give users an alternative way to run the notebook without knowing the secrets. I don't have access to a shared web server inside the company.
Regards
# Assumed imports (the snippet is abbreviated); OAuth here is taken to be
# requests_oauthlib's OAuth1, which matches the call below.
import json
import requests
from requests_oauthlib import OAuth1 as OAuth

CLIENT_KEY = '*****'
CLIENT_SECRET = '*****'
BASE_URL = '*****'
access_token_key = '*****'
access_token_secret = '*****'

print('Getting user profile...')
oauth = OAuth(CLIENT_KEY, client_secret=CLIENT_SECRET,
              resource_owner_key=access_token_key,
              resource_owner_secret=access_token_secret)
r = requests.get(url=BASE_URL + '1/user/me/profile', auth=oauth)
print(json.dumps(r.json(), sort_keys=True, indent=4, separators=(',', ': ')))
...
To expand on @Korakot Chaovavanich's answer, here is that solution step by step:
- Create a file containing your keys and save it to Google Drive. It should look like this:
[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
- Install pydrive
!pip install -U -q PyDrive
- Authenticate with Google Drive, then download and parse the credentials file.
(Some of this code comes from @wenkesj's answer to this question.)
# Imports
import os
from pydrive.auth import GoogleAuth
from pydrive.drive import GoogleDrive
from google.colab import auth
from oauth2client.client import GoogleCredentials
# Google drive authentication
auth.authenticate_user()
gauth = GoogleAuth()
gauth.credentials = GoogleCredentials.get_application_default()
drive = GoogleDrive(gauth)
# File params
local_save_dir = "/root/.aws"
filename = "credentials"
save_path = "{0}/{1}".format(local_save_dir, filename)
# Choose/create a local (colab) directory to store the data.
local_download_path = os.path.expanduser(local_save_dir)
try:
    os.makedirs(local_download_path)
except OSError:
    pass  # directory already exists
drive_list = drive.ListFile().GetList()
f = [x for x in drive_list if x["title"] == filename][0]
print('title: %s, id: %s' % (f['title'], f['id']))
fname = os.path.join(local_download_path, f['title'])
print('downloading to {}'.format(fname))
f_ = drive.CreateFile({'id': f['id']})
f_.GetContentFile(fname)
with open(save_path) as creds:
    for i, line in enumerate(creds):
        if i == 1:
            access_token_key = line.replace("aws_access_key_id=", "").replace("\n", "")
        if i == 2:
            access_token_secret = line.replace("aws_secret_access_key=", "").replace("\n", "")
Now your AWS keys are in the two variables access_token_key and access_token_secret.
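As an aside (not part of the original answer): since the file is in the standard INI layout shown above, Python's built-in configparser can pull the two values out without counting lines. A minimal sketch, reusing the save_path variable from the code above:
import configparser

# Parse the downloaded credentials file; it has a single [default] section.
config = configparser.ConfigParser()
config.read(save_path)

access_token_key = config["default"]["aws_access_key_id"]
access_token_secret = config["default"]["aws_secret_access_key"]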
Try getpass. For example:
from getpass import getpass
secret = getpass('Enter the secret value: ')
Then, you can share the notebook and each user can enter a distinct value, which you can then use later in the notebook as a regular Python variable.
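In the OP's notebook, the hard-coded constants could simply be replaced by prompts. A minimal sketch, assuming (as in the question's snippet) that the OAuth helper is requests_oauthlib's OAuth1:
from getpass import getpass

from requests_oauthlib import OAuth1 as OAuth

# Each user types their own values; nothing is stored in the shared notebook.
CLIENT_KEY = getpass('Client key: ')
CLIENT_SECRET = getpass('Client secret: ')
access_token_key = getpass('Access token key: ')
access_token_secret = getpass('Access token secret: ')

oauth = OAuth(CLIENT_KEY, client_secret=CLIENT_SECRET,
              resource_owner_key=access_token_key,
              resource_owner_secret=access_token_secret)
getpass does not echo what is typed, and the secrets only exist in the running session's memory, not in the saved notebook.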
You can save the secret key as a file on Google Drive, then read the file into Colab.
You can then set permissions on the key file in Google Drive, so only you and the people you share the file with can use it.
Update
As @efbbrown suggests, you can create an AWS key file and store it in Google Drive, e.g.
[default]
aws_access_key_id=AKIAIOSFODNN7EXAMPLE
aws_secret_access_key=wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
But now (2020) you don't need pydrive any more. You can just:
- Open the file pane on the left side of Colab.
- Select 'Mount Drive'
- Accept by clicking 'Connect to Google Drive'
- Copy that file to Colab, using the code below.
The default place to store credentials is ~/.aws/config. So you can do this (if your file above is named aws_config):
!mkdir -p ~/.aws
!cp "/content/drive/My Drive/aws_config" ~/.aws/config