Auto-format all files using yapf and pre-commit (#1173)

Automatic fixes produced by yapf formatting using `--style=google`, as
well as common pre-commit fixes such as trailing-whitespace removal,
the double-quote string fixer, and ensuring a newline at the end of each file.
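For readers unfamiliar with these hooks, the two simplest are easy to emulate. A minimal sketch, with illustrative function names rather than pre-commit's actual implementation, of what the trailing-whitespace and end-of-file-fixer hooks do to a file's text:

```python
def strip_trailing_whitespace(text: str) -> str:
    """Remove whitespace at the end of every line (trailing-whitespace hook)."""
    return '\n'.join(line.rstrip() for line in text.split('\n'))


def ensure_final_newline(text: str) -> str:
    """Guarantee non-empty text ends with exactly one newline (end-of-file-fixer hook)."""
    return text.rstrip('\n') + '\n' if text else text


source = 'x = 1   \ny = 2'
fixed = ensure_final_newline(strip_trailing_whitespace(source))
print(repr(fixed))  # 'x = 1\ny = 2\n'
```

The real hooks rewrite files in place; this only shows the text transformation they apply.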
ejochman 2020-04-28 13:59:47 -07:00 committed by GitHub
parent 216f2920b9
commit 0bd4eefeca
35 changed files with 18967 additions and 14714 deletions


@@ -5,7 +5,7 @@ Please confirm the following:
* I am typing the command as described in the GAM Wiki at https://github.com/jay0lee/gam/wiki
Full steps to reproduce the issue:
1.
2.
3.

.github/stale.yml vendored

@@ -33,7 +33,7 @@ staleLabel: wontfix
markComment: >
  This issue has been automatically marked as stale because it has not had
  recent activity. It will be closed if no further activity occurs.
# Comment to post when removing the stale label.
# unmarkComment: >
#   Your comment here.


@@ -9,10 +9,12 @@ repos:
     hooks:
     - id: trailing-whitespace
     - id: end-of-file-fixer
+    - id: double-quote-string-fixer
     - id: check-yaml
     - id: check-docstring-first
     - id: name-tests-test
     - id: requirements-txt-fixer
+    - id: check-merge-conflict
 - repo: https://github.com/pre-commit/mirrors-yapf
   rev: v0.29.0


@@ -224,7 +224,7 @@ If an item contains spaces, it should be surrounded by ".
<Section> ::= <String>
<SerialNumber> ::= <String>
<ServiceAccountKey> ::= <String>
<S/MIMEID> ::= <String>
<SMTPHostName> ::= <String>
<StudentItem> ::= <EmailAddress>|<UniqueID>|<String>
<TeamDriveID> ::= <String>
@@ -773,7 +773,7 @@ Specify a collection of Users by directly specifying them or by specifying item
(name <String>)|
(type <String>)|
(uservisibledescription <String>)
<SchemaFieldDefinition> ::=
field <FieldName> (type bool|date|double|email|int64|phone|string) [multivalued|multivalue] [indexed] [restricted] [range <Number> <Number>] endfield
@@ -1253,10 +1253,10 @@ gam download export <MatterItem> <ExportItem> [noverify] [noextract] [targetfold
gam create vaulthold|hold corpus drive|groups|mail matter <MatterItem> [name <String>] [query <QueryVaultCorpus>]
    [(accounts|groups|users <EmailItemList>) | (orgunit|ou <OrgUnit>)]
    [start|starttime <Date>|<Time>] [end|endtime <Date>|<Time>]
gam update vaulthold|hold <HoldItem> matter <MatterItem> [query <QueryVaultCorpus>]
    [([addaccounts|addgroups|addusers <EmailItemList>] [removeaccounts|removegroups|removeusers <EmailItemList>]) | (orgunit|ou <OrgUnit>)]
    [start|starttime <Date>|<Time>] [end|endtime <Date>|<Time>]
gam delete vaulthold|hold <HoldItem> matter <MatterItem>
gam info vaulthold|hold <HoldItem> matter <MatterItem>
gam print vaultholds|holds [todrive] [matters <MatterItemList>]
@@ -1369,7 +1369,7 @@ gam <UserTypeEntity> insertemail [recipient|to <EmailAddress>] [from <EmailAddre
    [deleted] [date <Time>]
gam <UserTypeEntity> sendemail [recipient|to <EmailAddress>] [from <EmailAddress>]
    [subject <String>] [(message <String>)|(file <FileName> [charset <Charset>])]
    (header <String> <String>)*
gam <UserTypeEntity> create|add delegate|delegates <EmailAddress>
gam <UserTypeEntity> delegate|delegates to <EmailAddress>


@@ -202,12 +202,12 @@
APACHE HTTP SERVER SUBCOMPONENTS:
The Apache HTTP Server includes a number of subcomponents with
separate copyright notices and license terms. Your use of the source
code for the these subcomponents is subject to the terms and
conditions of the following licenses.
For the mod_mime_magic component:
@@ -273,7 +273,7 @@ For the server\util_md5.c component:
 * Original Code Copyright (C) 1994, Jeff Hostetler, Spyglass, Inc.
 * Portions of Content-MD5 code Copyright (C) 1993, 1994 by Carnegie Mellon
 * University (see Copyright below).
 * Portions of Content-MD5 code Copyright (C) 1991 Bell Communications
 * Research, Inc. (Bellcore) (see Copyright below).
 * Portions extracted from mpack, John G. Myers - jgm+@cmu.edu
 * Content-MD5 Code contributed by Martin Hamilton (martin@net.lut.ac.uk)
@@ -319,10 +319,10 @@ For the server\util_md5.c component:
 * of an authorized representative of Bellcore. BELLCORE
 * MAKES NO REPRESENTATIONS ABOUT THE ACCURACY OR SUITABILITY
 * OF THIS MATERIAL FOR ANY PURPOSE. IT IS PROVIDED "AS IS",
 * WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES.
 */
For the srclib\apr\include\apr_md5.h component:
/*
 * This is work is derived from material Copyright RSA Data Security, Inc.
 *
@@ -501,21 +501,21 @@ This program is Copyright (C) Zeus Technology Limited 1996.
This program may be used and copied freely providing this copyright notice
is not removed.
This software is provided "as is" and any express or implied waranties,
including but not limited to, the implied warranties of merchantability and
fitness for a particular purpose are disclaimed. In no event shall
Zeus Technology Ltd. be liable for any direct, indirect, incidental, special,
exemplary, or consequential damaged (including, but not limited to,
procurement of substitute good or services; loss of use, data, or profits;
or business interruption) however caused and on theory of liability. Whether
in contract, strict liability or tort (including negligence or otherwise)
arising in any way out of the use of this software, even if advised of the
possibility of such damage.
Written by Adam Twiss (adam@zeus.co.uk). March 1996
Thanks to the following people for their input:
Mike Belshe (mbelshe@netscape.com)
Michael Campanella (campanella@stevms.enet.dec.com)
*/
@@ -532,10 +532,10 @@ without limitation the rights to use, copy, modify, merge, publish,
distribute, sublicense, and/or sell copies of the Software, and to
permit persons to whom the Software is furnished to do so, subject to
the following conditions:
The above copyright notice and this permission notice shall be included
in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND,
EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.


@@ -142,7 +142,7 @@
          "location": "query"
        }
      }
    },
    "list": {
      "id": "cloudprint.jobs.list",
      "path": "jobs",
@@ -347,7 +347,7 @@
          "type": "boolean",
          "location": "query"
        }
      }
    },
    "unshare": {
      "id": "cloudprint.printers.unshare",
@@ -479,7 +479,7 @@
            "location": "query"
          }
        }
      }
    }
  }
}


@@ -7,5 +7,5 @@ import sys
 from gam.__main__ import main
 # Run from command line
-if __name__ == "__main__":
+if __name__ == '__main__':
     main(sys.argv)
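The `"__main__"` to `'__main__'` change in this file comes from the double-quote-string-fixer hook, which prefers single quotes for plain string literals. A rough sketch of the rewrite rule for one literal (illustrative only; the actual hook works on Python tokens, not raw strings):

```python
def prefer_single_quotes(literal: str) -> str:
    """Rewrite a simple double-quoted string literal to single quotes,
    skipping literals whose contents contain quotes or escapes."""
    body = literal[1:-1]
    if (literal.startswith('"') and literal.endswith('"')
            and "'" not in body and '\\' not in body):
        return "'" + body + "'"
    return literal


print(prefer_single_quotes('"__main__"'))  # '__main__'
print(prefer_single_quotes('"it\'s"'))     # unchanged: contents contain a quote
```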

File diff suppressed because it is too large.


@@ -29,6 +29,7 @@ from multiprocessing import set_start_method
 from gam import controlflow
 import gam
 def main(argv):
     freeze_support()
     if sys.platform == 'darwin':
@@ -37,11 +38,13 @@ def main(argv):
         # command line arguments
         set_start_method('fork')
     if sys.version_info[0] < 3 or sys.version_info[1] < 6:
-        controlflow.system_error_exit(5,
-            f'GAM requires Python 3.6 or newer. You are running %s.%s.%s. Please upgrade your Python version or use one of the binary GAM downloads.' % sys.version_info[
-            :3])
+        controlflow.system_error_exit(
+            5,
+            f'GAM requires Python 3.6 or newer. You are running %s.%s.%s. Please upgrade your Python version or use one of the binary GAM downloads.'
+            % sys.version_info[:3])
     sys.exit(gam.ProcessGAMCommand(sys.argv))
 # Run from command line
-if __name__ == "__main__":
+if __name__ == '__main__':
     main(sys.argv)
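The wrapped condition checks the major and minor fields separately, which misreads hypothetical future majors: a 4.0 interpreter would fail the `minor < 6` test despite being newer than 3.6. A common alternative, shown here as a sketch rather than GAM's actual code, compares version tuples directly:

```python
import sys


def require_python(minimum=(3, 6)):
    """Exit with an error if the running interpreter is older than `minimum`.

    Tuple comparison handles major and minor together, so e.g. (4, 0)
    correctly satisfies a (3, 6) minimum."""
    if sys.version_info[:2] < minimum:
        raise SystemExit('GAM requires Python %d.%d or newer. '
                         'You are running %d.%d.%d.' %
                         (minimum + sys.version_info[:3]))


require_python()  # no-op on any interpreter >= 3.6
```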


@@ -32,48 +32,48 @@ MESSAGE_LOCAL_SERVER_SUCCESS = ('The authentication flow has completed. You may'
class CredentialsError(Exception):
    """Base error class."""
    pass
class InvalidCredentialsFileError(CredentialsError):
    """Error raised when a file cannot be opened into a credentials object."""
    pass
class EmptyCredentialsFileError(InvalidCredentialsFileError):
    """Error raised when a credentials file contains no content."""
    pass
class InvalidClientSecretsFileFormatError(CredentialsError):
    """Error raised when a client secrets file format is invalid."""
    pass
class InvalidClientSecretsFileError(CredentialsError):
    """Error raised when client secrets file cannot be read."""
    pass
class Credentials(google.oauth2.credentials.Credentials):
    """Google OAuth2.0 Credentials with GAM-specific properties and methods."""
    DATETIME_FORMAT = '%Y-%m-%dT%H:%M:%SZ'
    def __init__(self,
                 token,
                 refresh_token=None,
                 id_token=None,
                 token_uri=None,
                 client_id=None,
                 client_secret=None,
                 scopes=None,
                 quota_project_id=None,
                 expiry=None,
                 id_token_data=None,
                 filename=None):
        """A thread-safe OAuth2.0 credentials object.
        Credentials adds additional utility properties and methods to a
        standard OAuth2.0 credentials object. When used to store credentials on
@@ -115,43 +115,42 @@ class Credentials(google.oauth2.credentials.Credentials):
         Raises:
             TypeError: If id_token_data is not the required dict type.
         """
-        super(Credentials, self).__init__(
-            token=token,
-            refresh_token=refresh_token,
-            id_token=id_token,
-            token_uri=token_uri,
-            client_id=client_id,
-            client_secret=client_secret,
-            scopes=scopes,
-            quota_project_id=quota_project_id)
+        super(Credentials, self).__init__(token=token,
+                                          refresh_token=refresh_token,
+                                          id_token=id_token,
+                                          token_uri=token_uri,
+                                          client_id=client_id,
+                                          client_secret=client_secret,
+                                          scopes=scopes,
+                                          quota_project_id=quota_project_id)
         # Load data not restored by the super class
         self.expiry = expiry
         if id_token_data and not isinstance(id_token_data, dict):
             raise TypeError(f'Expected type id_token_data dict but received '
                             f'{type(id_token_data)}')
         self._id_token_data = id_token_data.copy() if id_token_data else None
         # If a filename is provided, use a lock file to control concurrent access
         # to the resource. If no filename is provided, use a thread lock that has
         # the same interface as FileLock in order to simplify the implementation.
         if filename:
             # Convert relative paths into absolute
             self._filename = os.path.abspath(filename)
             lock_file = os.path.abspath(f'{self._filename}.lock')
             self._lock = FileLock(lock_file)
         else:
             self._filename = None
             self._lock = _FileLikeThreadLock()
     # Use a property to prevent external mutation of the filename.
     @property
     def filename(self):
         return self._filename
     @classmethod
     def from_authorized_user_info(cls, info, filename=None):
         """Generates Credentials from JSON containing authorized user info.
         Args:
             info: Dict, authorized user info in Google format.
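`_FileLikeThreadLock`, referenced in the else branch above, is defined elsewhere in this module. The idea is a plain thread lock that mimics enough of the `filelock.FileLock` surface (acquire/release, `is_locked`, context manager) that the rest of the class can treat both kinds of lock uniformly. A minimal sketch of the assumed shape, not the module's actual class:

```python
import threading


class FileLikeThreadLock:
    """Thread lock exposing a FileLock-like interface (illustrative)."""

    def __init__(self):
        self._lock = threading.Lock()
        self.is_locked = False

    def acquire(self):
        self._lock.acquire()
        self.is_locked = True

    def release(self):
        self.is_locked = False
        self._lock.release()

    def __enter__(self):
        self.acquire()
        return self

    def __exit__(self, *exc_info):
        self.release()
```

With this shape, `with self._lock:` and `assert self._lock.is_locked` work the same whether the lock guards a file or only in-process threads.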
@@ -161,56 +160,56 @@ class Credentials(google.oauth2.credentials.Credentials):
         Raises:
             ValueError: If missing fields are detected in the info.
         """
         # We need all of these keys
         keys_needed = set(('client_id', 'client_secret'))
         # We need 1 or more of these keys
         keys_need_one_of = set(('refresh_token', 'auth_token', 'token'))
         missing = keys_needed.difference(info.keys())
         has_one_of = set(info) & keys_need_one_of
         if missing or not has_one_of:
             raise ValueError(
                 'Authorized user info was not in the expected format, missing '
                 f'fields {", ".join(missing)} and one of '
                 f'{", ".join(keys_need_one_of)}.')
         expiry = info.get('token_expiry')
         if expiry:
             # Convert the raw expiry to datetime
-            expiry = datetime.datetime.strptime(expiry, Credentials.DATETIME_FORMAT)
+            expiry = datetime.datetime.strptime(expiry,
+                                                Credentials.DATETIME_FORMAT)
         id_token_data = info.get('decoded_id_token')
         # Provide backwards compatibility with field names when loading from JSON.
         # Some field names may be different, depending on when/how the credentials
         # were pickled.
-        return cls(
-            token=info.get('token', info.get('auth_token', '')),
-            refresh_token=info.get('refresh_token', ''),
-            id_token=info.get('id_token_jwt', info.get('id_token')),
-            token_uri=info.get('token_uri'),
-            client_id=info['client_id'],
-            client_secret=info['client_secret'],
-            scopes=info.get('scopes'),
-            quota_project_id=info.get('quota_project_id'),
-            expiry=expiry,
-            id_token_data=id_token_data,
-            filename=filename)
+        return cls(token=info.get('token', info.get('auth_token', '')),
+                   refresh_token=info.get('refresh_token', ''),
+                   id_token=info.get('id_token_jwt', info.get('id_token')),
+                   token_uri=info.get('token_uri'),
+                   client_id=info['client_id'],
+                   client_secret=info['client_secret'],
+                   scopes=info.get('scopes'),
+                   quota_project_id=info.get('quota_project_id'),
+                   expiry=expiry,
+                   id_token_data=id_token_data,
+                   filename=filename)
     @classmethod
     def from_google_oauth2_credentials(cls, credentials, filename=None):
         """Generates Credentials from a google.oauth2.Credentials object."""
         info = json.loads(credentials.to_json())
         # Add properties which are not exported with the native to_json() output.
         info['id_token'] = credentials.id_token
         if credentials.expiry:
             info['token_expiry'] = credentials.expiry.strftime(
                 Credentials.DATETIME_FORMAT)
         info['quota_project_id'] = credentials.quota_project_id
         return cls.from_authorized_user_info(info, filename=filename)
     @classmethod
     def from_credentials_file(cls, filename):
         """Generates Credentials from a stored Credentials file.
         The same file will be used to save the credentials when the access token is
         refreshed.
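The field validation in `from_authorized_user_info` above is a reusable pattern: require all of one key set plus at least one of another. Sketched standalone with a hypothetical helper name:

```python
def check_required_keys(info, keys_needed, keys_need_one_of):
    """Raise ValueError unless `info` has every key in `keys_needed`
    and at least one key from `keys_need_one_of`."""
    missing = set(keys_needed).difference(info)
    has_one_of = set(info) & set(keys_need_one_of)
    if missing or not has_one_of:
        raise ValueError(f'missing fields {", ".join(sorted(missing))} '
                         f'and one of {", ".join(sorted(keys_need_one_of))}')


# Passes: both required keys present, plus one of the token keys.
check_required_keys({'client_id': 'x', 'client_secret': 'y', 'token': 't'},
                    ('client_id', 'client_secret'),
                    ('refresh_token', 'auth_token', 'token'))
```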
@@ -226,32 +225,34 @@ class Credentials(google.oauth2.credentials.Credentials):
         InvalidCredentialsFileError: When the credentials file cannot be opened.
         EmptyCredentialsFileError: When the provided file contains no credentials.
         """
-        file_content = fileutils.read_file(
-            filename, continue_on_error=True, display_errors=False)
-        if file_content is None:
-            raise InvalidCredentialsFileError(f'File (unknown) could not be opened')
-        info = json.loads(file_content)
-        if not info:
-            raise EmptyCredentialsFileError(
-                f'File (unknown) contains no credential data')
+        file_content = fileutils.read_file(filename,
+                                           continue_on_error=True,
+                                           display_errors=False)
+        if file_content is None:
+            raise InvalidCredentialsFileError(
+                f'File (unknown) could not be opened')
+        info = json.loads(file_content)
+        if not info:
+            raise EmptyCredentialsFileError(
+                f'File (unknown) contains no credential data')
         try:
             # We read the existing data from the passed in file, but we also want to
             # save future data/tokens in the same place.
             return cls.from_authorized_user_info(info, filename=filename)
         except ValueError as e:
             raise InvalidCredentialsFileError(str(e))
     @classmethod
     def from_client_secrets(cls,
                             client_id,
                             client_secret,
                             scopes,
                             access_type='offline',
                             login_hint=None,
                             filename=None,
                             use_console_flow=False):
         """Runs an OAuth Flow from client secrets to generate credentials.
         Args:
             client_id: String, The OAuth2.0 Client ID.
@@ -275,46 +276,50 @@ class Credentials(google.oauth2.credentials.Credentials):
         Returns:
             Credentials
         """
         client_config = {
             'installed': {
                 'client_id': client_id,
                 'client_secret': client_secret,
-                'redirect_uris': ['http://localhost', 'urn:ietf:wg:oauth:2.0:oob'],
-                'auth_uri': 'https://accounts.google.com/o/oauth2/v2/auth',
-                'token_uri': 'https://oauth2.googleapis.com/token',
-            }
-        }
+                'redirect_uris': [
+                    'http://localhost', 'urn:ietf:wg:oauth:2.0:oob'
+                ],
+                'auth_uri': 'https://accounts.google.com/o/oauth2/v2/auth',
+                'token_uri': 'https://oauth2.googleapis.com/token',
+            }
+        }
-        flow = _ShortURLFlow.from_client_config(
-            client_config, scopes, autogenerate_code_verifier=True)
+        flow = _ShortURLFlow.from_client_config(client_config,
+                                                scopes,
+                                                autogenerate_code_verifier=True)
         flow_kwargs = {'access_type': access_type}
         if login_hint:
             flow_kwargs['login_hint'] = login_hint
         # TODO: Move code for browser detection somewhere in this file so that the
         # messaging about `nobrowser.txt` is co-located with the logic that uses it.
         if use_console_flow:
-            flow.run_console(
-                authorization_prompt_message=MESSAGE_CONSOLE_AUTHORIZATION_PROMPT,
-                authorization_code_message=MESSAGE_CONSOLE_AUTHORIZATION_CODE,
-                **flow_kwargs)
-        else:
-            flow.run_local_server(
-                authorization_prompt_message=MESSAGE_LOCAL_SERVER_AUTHORIZATION_PROMPT,
-                success_message=MESSAGE_LOCAL_SERVER_SUCCESS,
-                **flow_kwargs)
-        return cls.from_google_oauth2_credentials(
-            flow.credentials, filename=filename)
+            flow.run_console(
+                authorization_prompt_message=
+                MESSAGE_CONSOLE_AUTHORIZATION_PROMPT,
+                authorization_code_message=MESSAGE_CONSOLE_AUTHORIZATION_CODE,
+                **flow_kwargs)
+        else:
+            flow.run_local_server(authorization_prompt_message=
+                                  MESSAGE_LOCAL_SERVER_AUTHORIZATION_PROMPT,
+                                  success_message=MESSAGE_LOCAL_SERVER_SUCCESS,
+                                  **flow_kwargs)
+        return cls.from_google_oauth2_credentials(flow.credentials,
+                                                  filename=filename)
     @classmethod
     def from_client_secrets_file(cls,
                                  client_secrets_file,
                                  scopes,
                                  access_type='offline',
                                  login_hint=None,
                                  credentials_file=None,
                                  use_console_flow=False):
         """Runs an OAuth Flow from secrets stored on disk to generate credentials.
         Args:
             client_secrets_file: String, path to a file containing a client ID and
@@ -345,53 +350,55 @@ class Credentials(google.oauth2.credentials.Credentials):
         Returns:
             Credentials
         """
-        cs_data = fileutils.read_file(
-            client_secrets_file, continue_on_error=True, display_errors=False)
-        if not cs_data:
-            raise InvalidClientSecretsFileError(
-                f'File {client_secrets_file} could not be opened')
-        try:
-            cs_json = json.loads(cs_data)
-            client_id = cs_json['installed']['client_id']
-            # Chop off .apps.googleusercontent.com suffix as it's not needed
-            # and we need to keep things short for the Auth URL.
-            client_id = re.sub(r'\.apps\.googleusercontent\.com$', '', client_id)
-            client_secret = cs_json['installed']['client_secret']
-        except (ValueError, IndexError, KeyError):
-            raise InvalidClientSecretsFileFormatError(
-                f'Could not extract Client ID or Client Secret from file {client_secrets_file}'
-            )
-        return cls.from_client_secrets(
-            client_id,
-            client_secret,
-            scopes,
-            access_type=access_type,
-            login_hint=login_hint,
-            filename=credentials_file,
-            use_console_flow=use_console_flow)
+        cs_data = fileutils.read_file(client_secrets_file,
+                                      continue_on_error=True,
+                                      display_errors=False)
+        if not cs_data:
+            raise InvalidClientSecretsFileError(
+                f'File {client_secrets_file} could not be opened')
+        try:
+            cs_json = json.loads(cs_data)
+            client_id = cs_json['installed']['client_id']
+            # Chop off .apps.googleusercontent.com suffix as it's not needed
+            # and we need to keep things short for the Auth URL.
+            client_id = re.sub(r'\.apps\.googleusercontent\.com$', '',
+                               client_id)
+            client_secret = cs_json['installed']['client_secret']
+        except (ValueError, IndexError, KeyError):
+            raise InvalidClientSecretsFileFormatError(
+                f'Could not extract Client ID or Client Secret from file {client_secrets_file}'
+            )
+        return cls.from_client_secrets(client_id,
+                                       client_secret,
+                                       scopes,
+                                       access_type=access_type,
+                                       login_hint=login_hint,
+                                       filename=credentials_file,
+                                       use_console_flow=use_console_flow)
     def _fetch_id_token_data(self):
         """Fetches verification details from Google for the OAuth2.0 token.
         See more: https://developers.google.com/identity/sign-in/web/backend-auth
         Raises:
             CredentialsError: If no id_token is present.
         """
         if not self.id_token:
-            raise CredentialsError('Failed to fetch token data. No id_token present.')
+            raise CredentialsError(
+                'Failed to fetch token data. No id_token present.')
         request = transport.create_request()
         if self.expired:
             # The id_token needs to be unexpired, in order to request data about it.
             self.refresh(request)
         self._id_token_data = google.oauth2.id_token.verify_oauth2_token(
             self.id_token, request)
     def get_token_value(self, field):
         """Retrieves data from the OAuth ID token.
         See more: https://developers.google.com/identity/sign-in/web/backend-auth
@@ -402,14 +409,14 @@ class Credentials(google.oauth2.credentials.Credentials):
        The value associated with the given key or 'Unknown' if the key data can
        not be found in the access token data.
        """
        if not self._id_token_data:
            self._fetch_id_token_data()
        # Maintain legacy GAM behavior here to return "Unknown" if the field is
        # otherwise unpopulated.
        return self._id_token_data.get(field, 'Unknown')
    def to_json(self, strip=None):
        """Creates a JSON representation of a Credentials.
        Args:
            strip: Sequence[str], Optional list of members to exclude from the
@@ -419,32 +426,32 @@ class Credentials(google.oauth2.credentials.Credentials):
            str: A JSON representation of this instance, suitable to pass to
                from_json().
        """
        expiry = self.expiry.strftime(
            Credentials.DATETIME_FORMAT) if self.expiry else None
        prep = {
            'token': self.token,
            'refresh_token': self.refresh_token,
            'token_uri': self.token_uri,
            'client_id': self.client_id,
            'client_secret': self.client_secret,
            'id_token': self.id_token,
            # Google auth doesn't currently give us scopes back on refresh.
            # 'scopes': sorted(self.scopes),
            'token_expiry': expiry,
            'decoded_id_token': self._id_token_data,
        }
        # Remove empty entries
        prep = {k: v for k, v in prep.items() if v is not None}
        # Remove entries that explicitly need to be removed
        if strip is not None:
            prep = {k: v for k, v in prep.items() if k not in strip}
        return json.dumps(prep, indent=2, sort_keys=True)
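The to_json logic above boils down to two dict-comprehension passes before serializing. A minimal standalone sketch of that pattern (the bare `to_json` helper and sample keys here are illustrative, not GAM's actual API):

```python
import json


def to_json(data, strip=None):
    # First drop empty entries, then drop any keys the caller asked to strip.
    prep = {k: v for k, v in data.items() if v is not None}
    if strip is not None:
        prep = {k: v for k, v in prep.items() if k not in strip}
    return json.dumps(prep, indent=2, sort_keys=True)


serialized = to_json({'token': 'abc', 'refresh_token': None, 'id_token': 'xyz'},
                     strip=['id_token'])
assert json.loads(serialized) == {'token': 'abc'}
```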
    def refresh(self, request=None):
        """Refreshes the credential's access token.

        Args:
            request: google.auth.transport.Request, The object used to make HTTP
@@ -454,99 +461,100 @@ class Credentials(google.oauth2.credentials.Credentials):
            google.auth.exceptions.RefreshError: If the credentials could not be
                refreshed.
        """
        with self._lock:
            if request is None:
                request = transport.create_request()
            self._locked_refresh(request)
            # Save the new tokens back to disk, if these credentials are disk-backed.
            if self._filename:
                self._locked_write()

    def _locked_refresh(self, request):
        """Refreshes the credential's access token while the file lock is held."""
        assert self._lock.is_locked
        super(Credentials, self).refresh(request)

    def write(self):
        """Writes credentials to disk."""
        with self._lock:
            self._locked_write()

    def _locked_write(self):
        """Writes credentials to disk while the file lock is held."""
        assert self._lock.is_locked
        if not self.filename:
            # If no filename was provided to the constructor, these credentials
            # cannot be saved to disk.
            raise CredentialsError(
                'The credentials have no associated filename and cannot be saved '
                'to disk.')
        fileutils.write_file(self._filename, self.to_json())

    def delete(self):
        """Deletes all files on disk related to these credentials."""
        with self._lock:
            # Only attempt to remove the file if the lock we're using is a FileLock.
            if isinstance(self._lock, FileLock):
                os.remove(self._filename)
                if self._lock.lock_file and not GM_Globals[GM_WINDOWS]:
                    os.remove(self._lock.lock_file)

    _REVOKE_TOKEN_BASE_URI = 'https://accounts.google.com/o/oauth2/revoke'

    def revoke(self, http=None):
        """Revokes this credential's access token with the server.

        Args:
            http: httplib2.Http compatible object for use as a transport. If no
                http is provided, a default will be used.
        """
        with self._lock:
            if http is None:
                http = transport.create_http()
            params = urlencode({'token': self.refresh_token})
            revoke_uri = f'{Credentials._REVOKE_TOKEN_BASE_URI}?{params}'
            http.request(revoke_uri, 'GET')
class _ShortURLFlow(google_auth_oauthlib.flow.InstalledAppFlow):
    """InstalledAppFlow which utilizes a URL shortener for authorization URLs."""

    URL_SHORTENER_ENDPOINT = 'https://gam-shortn.appspot.com/create'

    def authorization_url(self, http=None, **kwargs):
        """Gets a shortened authorization URL."""
        long_url, state = super(_ShortURLFlow, self).authorization_url(**kwargs)
        short_url = utils.shorten_url(long_url)
        return short_url, state
class _FileLikeThreadLock(object):
    """A threading.Lock which has the same interface as filelock.FileLock."""

    def __init__(self):
        """A shell object that holds a threading.Lock.

        Since we cannot inherit from built-in classes such as threading.Lock, we
        just use a shell object and maintain a lock inside of it.
        """
        self._lock = threading.Lock()

    def __enter__(self, *args, **kwargs):
        return self._lock.__enter__(*args, **kwargs)

    def __exit__(self, *args, **kwargs):
        return self._lock.__exit__(*args, **kwargs)

    def acquire(self, **kwargs):
        return self._lock.acquire(**kwargs)

    def release(self):
        return self._lock.release()

    @property
    def is_locked(self):
        return self._lock.locked()

    @property
    def lock_file(self):
        return None
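This wrapper works because the surrounding code only duck-types against the filelock.FileLock surface (context manager plus `is_locked`/`lock_file`). A condensed sketch of the same idea (class name shortened, only the members needed to show the pattern):

```python
import threading


class FileLikeThreadLock:
    """Wraps threading.Lock behind a filelock.FileLock-style interface."""

    def __init__(self):
        # threading.Lock cannot be subclassed, so hold one as a member.
        self._lock = threading.Lock()

    def __enter__(self, *args, **kwargs):
        return self._lock.__enter__(*args, **kwargs)

    def __exit__(self, *args, **kwargs):
        return self._lock.__exit__(*args, **kwargs)

    @property
    def is_locked(self):
        return self._lock.locked()


lock = FileLikeThreadLock()
assert not lock.is_locked
with lock:
    assert lock.is_locked   # held inside the with-block
assert not lock.is_locked   # released on exit
```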

File diff suppressed because it is too large

View File

@@ -9,72 +9,72 @@ from gam.var import MESSAGE_INVALID_JSON
def system_error_exit(return_code, message):
    """Raises a system exit with the given return code and message.

    Args:
        return_code: Int, the return code to yield when the system exits.
        message: An error message to print before the system exits.
    """
    if message:
        display.print_error(message)
    sys.exit(return_code)
def invalid_argument_exit(argument, command):
    """Indicate that the argument is not valid for the command.

    Args:
        argument: the invalid argument
        command: the base GAM command
    """
    system_error_exit(2, f'{argument} is not a valid argument for "{command}"')


def missing_argument_exit(argument, command):
    """Indicate that the argument is missing for the command.

    Args:
        argument: the missing argument
        command: the base GAM command
    """
    system_error_exit(2, f'missing argument {argument} for "{command}"')
def expected_argument_exit(name, expected, argument):
    """Indicate that the argument does not have an expected value for the command.

    Args:
        name: the field name
        expected: the expected values
        argument: the invalid argument
    """
    system_error_exit(2, f'{name} must be one of {expected}; got {argument}')


def csv_field_error_exit(field_name, field_names):
    """Raises a system exit when a CSV field is malformed.

    Args:
        field_name: The CSV field name for which a header does not exist in the
            existing CSV headers.
        field_names: The known list of CSV headers.
    """
    system_error_exit(
        2,
        MESSAGE_HEADER_NOT_FOUND_IN_CSV_HEADERS.format(field_name,
                                                       ','.join(field_names)))


def invalid_json_exit(file_name):
    """Raises a system exit when invalid JSON content is encountered."""
    system_error_exit(17, MESSAGE_INVALID_JSON.format(file_name))
def wait_on_failure(current_attempt_num,
                    total_num_retries,
                    error_message,
                    error_print_threshold=3):
    """Executes an exponential backoff-style system sleep.

    Args:
        current_attempt_num: Int, the current number of retries.
@@ -86,11 +86,11 @@ def wait_on_failure(current_attempt_num,
            error messages suppressed. Any current_attempt_num greater than
            error_print_threshold will print the prescribed error.
    """
    wait_on_fail = min(2**current_attempt_num,
                       60) + float(random.randint(1, 1000)) / 1000
    if current_attempt_num > error_print_threshold:
        sys.stderr.write((f'Temporary error: {error_message}, Backing off: '
                          f'{int(wait_on_fail)} seconds, Retry: '
                          f'{current_attempt_num}/{total_num_retries}\n'))
        sys.stderr.flush()
    time.sleep(wait_on_fail)
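The sleep computed above is capped exponential backoff plus up to one second of random jitter, so concurrent clients don't retry in lockstep. Sketched in isolation (the `backoff_delay` helper name is ours, not GAM's):

```python
import random


def backoff_delay(attempt, cap=60):
    """Exponential backoff capped at `cap` seconds, plus up to 1s of jitter."""
    return min(2**attempt, cap) + float(random.randint(1, 1000)) / 1000


# The base delay doubles each attempt (2, 4, 8, ...) until the 60s cap.
assert 2 <= backoff_delay(1) <= 3
assert 8 <= backoff_delay(3) <= 9
assert 60 <= backoff_delay(10) <= 61   # capped, jitter only
```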

View File

@@ -8,99 +8,101 @@ from gam import controlflow
class ControlFlowTest(unittest.TestCase):

    def test_system_error_exit_raises_systemexit_error(self):
        with self.assertRaises(SystemExit):
            controlflow.system_error_exit(1, 'exit message')

    def test_system_error_exit_raises_systemexit_with_return_code(self):
        with self.assertRaises(SystemExit) as context_manager:
            controlflow.system_error_exit(100, 'exit message')
        self.assertEqual(context_manager.exception.code, 100)

    @patch.object(controlflow.display, 'print_error')
    def test_system_error_exit_prints_error_before_exiting(
            self, mock_print_err):
        with self.assertRaises(SystemExit):
            controlflow.system_error_exit(100, 'exit message')
        self.assertIn('exit message', mock_print_err.call_args[0][0])

    def test_csv_field_error_exit_raises_systemexit_error(self):
        with self.assertRaises(SystemExit):
            controlflow.csv_field_error_exit('aField',
                                             ['unusedField1', 'unusedField2'])

    def test_csv_field_error_exit_exits_code_2(self):
        with self.assertRaises(SystemExit) as context_manager:
            controlflow.csv_field_error_exit('aField',
                                             ['unusedField1', 'unusedField2'])
        self.assertEqual(context_manager.exception.code, 2)

    @patch.object(controlflow.display, 'print_error')
    def test_csv_field_error_exit_prints_error_details(self, mock_print_err):
        with self.assertRaises(SystemExit):
            controlflow.csv_field_error_exit('aField',
                                             ['unusedField1', 'unusedField2'])
        printed_message = mock_print_err.call_args[0][0]
        self.assertIn('aField', printed_message)
        self.assertIn('unusedField1', printed_message)
        self.assertIn('unusedField2', printed_message)

    def test_invalid_json_exit_raises_systemexit_error(self):
        with self.assertRaises(SystemExit):
            controlflow.invalid_json_exit('filename')

    def test_invalid_json_exit_exit_exits_code_17(self):
        with self.assertRaises(SystemExit) as context_manager:
            controlflow.invalid_json_exit('filename')
        self.assertEqual(context_manager.exception.code, 17)

    @patch.object(controlflow.display, 'print_error')
    def test_invalid_json_exit_prints_error_details(self, mock_print_err):
        with self.assertRaises(SystemExit):
            controlflow.invalid_json_exit('filename')
        printed_message = mock_print_err.call_args[0][0]
        self.assertIn('filename', printed_message)

    @patch.object(controlflow.time, 'sleep')
    def test_wait_on_failure_waits_exponentially(self, mock_sleep):
        controlflow.wait_on_failure(1, 5, 'Backoff attempt #1')
        controlflow.wait_on_failure(2, 5, 'Backoff attempt #2')
        controlflow.wait_on_failure(3, 5, 'Backoff attempt #3')

        sleep_calls = mock_sleep.call_args_list
        self.assertGreaterEqual(sleep_calls[0][0][0], 2**1)
        self.assertGreaterEqual(sleep_calls[1][0][0], 2**2)
        self.assertGreaterEqual(sleep_calls[2][0][0], 2**3)

    @patch.object(controlflow.time, 'sleep')
    def test_wait_on_failure_does_not_exceed_60_secs_wait(self, mock_sleep):
        total_attempts = 20
        for attempt in range(1, total_attempts + 1):
            controlflow.wait_on_failure(
                attempt,
                total_attempts,
                'Attempt #%s' % attempt,
                # Suppress messages while we make a lot of attempts.
                error_print_threshold=total_attempts + 1)
            # Wait time may be between 60 and 61 secs, due to rand addition.
            self.assertLessEqual(mock_sleep.call_args[0][0], 61)

    # Prevent the system from actually sleeping and thus slowing down the test.
    @patch.object(controlflow.time, 'sleep')
    def test_wait_on_failure_prints_errors(self, unused_mock_sleep):
        message = 'An error message to display'
        with patch.object(controlflow.sys.stderr, 'write') as mock_stderr_write:
            controlflow.wait_on_failure(1, 5, message, error_print_threshold=0)
        self.assertIn(message, mock_stderr_write.call_args[0][0])

    @patch.object(controlflow.time, 'sleep')
    def test_wait_on_failure_only_prints_after_threshold(
            self, unused_mock_sleep):
        total_attempts = 5
        threshold = 3
        with patch.object(controlflow.sys.stderr, 'write') as mock_stderr_write:
            for attempt in range(1, total_attempts + 1):
                controlflow.wait_on_failure(attempt,
                                            total_attempts,
                                            'Attempt #%s' % attempt,
                                            error_print_threshold=threshold)
        self.assertEqual(total_attempts - threshold,
                         mock_stderr_write.call_count)
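These tests stay fast by stubbing `time.sleep` with `patch.object` and then asserting on the recorded call arguments instead of actually waiting. The same pattern in miniature (the `retry_with_backoff` function and its values are illustrative):

```python
import time
from unittest.mock import patch


def retry_with_backoff(attempts):
    # Sleep 2, 4, 8, ... seconds, capped at 60, between attempts.
    for n in range(1, attempts + 1):
        time.sleep(min(2**n, 60))


# Patching time.sleep records each requested delay without blocking.
with patch.object(time, 'sleep') as mock_sleep:
    retry_with_backoff(3)

delays = [call[0][0] for call in mock_sleep.call_args_list]
assert delays == [2, 4, 8]
```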

View File

@@ -17,29 +17,35 @@ from gam import gapi
def current_count(i, count):
    return f' ({i}/{count})' if (count > GC_Values[GC_SHOW_COUNTS_MIN]) else ''


def current_count_nl(i, count):
    return f' ({i}/{count})\n' if (
        count > GC_Values[GC_SHOW_COUNTS_MIN]) else '\n'


def add_field_to_fields_list(fieldName, fieldsChoiceMap, fieldsList):
    fields = fieldsChoiceMap[fieldName.lower()]
    if isinstance(fields, list):
        fieldsList.extend(fields)
    else:
        fieldsList.append(fields)


# Write a CSV file
def add_titles_to_csv_file(addTitles, titles):
    for title in addTitles:
        if title not in titles:
            titles.append(title)


def add_row_titles_to_csv_file(row, csvRows, titles):
    csvRows.append(row)
    for title in row:
        if title not in titles:
            titles.append(title)
# fieldName is command line argument
# fieldNameMap maps fieldName to API field names; CSV file header will be API field name
@@ -49,12 +55,14 @@ def add_row_titles_to_csv_file(row, csvRows, titles):
# }
# fieldsList is the list of API fields
# fieldsTitles maps the API field name to the CSV file header
def add_field_to_csv_file(fieldName, fieldNameMap, fieldsList, fieldsTitles,
                          titles):
    for ftList in fieldNameMap[fieldName]:
        if ftList not in fieldsTitles:
            fieldsList.append(ftList)
            fieldsTitles[ftList] = ftList
            add_titles_to_csv_file([ftList], titles)
# fieldName is command line argument
# fieldNameTitleMap maps fieldName to API field name and CSV file header
@@ -64,175 +72,226 @@ def add_field_to_csv_file(fieldName, fieldNameMap, fieldsList, fieldsTitles, titles):
# }
# fieldsList is the list of API fields
# fieldsTitles maps the API field name to the CSV file header
def add_field_title_to_csv_file(fieldName, fieldNameTitleMap, fieldsList,
                                fieldsTitles, titles):
    ftList = fieldNameTitleMap[fieldName]
    for i in range(0, len(ftList), 2):
        if ftList[i] not in fieldsTitles:
            fieldsList.append(ftList[i])
            fieldsTitles[ftList[i]] = ftList[i + 1]
            add_titles_to_csv_file([ftList[i + 1]], titles)


def sort_csv_titles(firstTitle, titles):
    restoreTitles = []
    for title in firstTitle:
        if title in titles:
            titles.remove(title)
            restoreTitles.append(title)
    titles.sort()
    for title in restoreTitles[::-1]:
        titles.insert(0, title)


def QuotedArgumentList(items):
    return ' '.join([
        item if item and (item.find(' ') == -1) and
        (item.find(',') == -1) else '"' + item + '"' for item in items
    ])
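QuotedArgumentList implements the quoting rule the repo's issue template states for GAM commands: an item containing spaces (or commas) is surrounded by `"`. Reproduced standalone with a usage check:

```python
def QuotedArgumentList(items):
    # Quote any item containing a space or comma (or an empty item) so the
    # original command line can be reproduced verbatim.
    return ' '.join([
        item if item and (item.find(' ') == -1) and
        (item.find(',') == -1) else '"' + item + '"' for item in items
    ])


result = QuotedArgumentList(['gam', 'print', 'users, groups', 'my domain'])
assert result == 'gam print "users, groups" "my domain"'
```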
def write_csv_file(csvRows, titles, list_type, todrive):

    def rowDateTimeFilterMatch(dateMode, rowDate, op, filterDate):
        if not rowDate or not isinstance(rowDate, str):
            return False
        try:
            rowTime = dateutil.parser.parse(rowDate, ignoretz=True)
            if dateMode:
                rowDate = datetime.datetime(rowTime.year, rowTime.month,
                                            rowTime.day).isoformat() + 'Z'
        except ValueError:
            rowDate = NEVER_TIME
        if op == '<':
            return rowDate < filterDate
        if op == '<=':
            return rowDate <= filterDate
        if op == '>':
            return rowDate > filterDate
        if op == '>=':
            return rowDate >= filterDate
        if op == '!=':
            return rowDate != filterDate
        return rowDate == filterDate

    def rowCountFilterMatch(rowCount, op, filterCount):
        if isinstance(rowCount, str):
            if not rowCount.isdigit():
                return False
            rowCount = int(rowCount)
        elif not isinstance(rowCount, int):
            return False
        if op == '<':
            return rowCount < filterCount
        if op == '<=':
            return rowCount <= filterCount
        if op == '>':
            return rowCount > filterCount
        if op == '>=':
            return rowCount >= filterCount
        if op == '!=':
            return rowCount != filterCount
        return rowCount == filterCount

    def rowBooleanFilterMatch(rowBoolean, filterBoolean):
        if not isinstance(rowBoolean, bool):
            return False
        return rowBoolean == filterBoolean

    def headerFilterMatch(filters, title):
        for filterStr in filters:
            if filterStr.match(title):
                return True
        return False

    if GC_Values[GC_CSV_ROW_FILTER]:
        for column, filterVal in iter(GC_Values[GC_CSV_ROW_FILTER].items()):
            if column not in titles:
                sys.stderr.write(
                    f'WARNING: Row filter column "{column}" is not in output columns\n'
                )
                continue
            if filterVal[0] == 'regex':
                csvRows = [
                    row for row in csvRows
                    if filterVal[1].search(str(row.get(column, '')))
                ]
            elif filterVal[0] == 'notregex':
                csvRows = [
                    row for row in csvRows
                    if not filterVal[1].search(str(row.get(column, '')))
                ]
            elif filterVal[0] in ['date', 'time']:
                csvRows = [
                    row for row in csvRows if rowDateTimeFilterMatch(
                        filterVal[0] == 'date', row.get(column, ''),
                        filterVal[1], filterVal[2])
                ]
            elif filterVal[0] == 'count':
                csvRows = [
                    row for row in csvRows if rowCountFilterMatch(
                        row.get(column, 0), filterVal[1], filterVal[2])
                ]
            else:  # boolean
                csvRows = [
                    row for row in csvRows if rowBooleanFilterMatch(
                        row.get(column, False), filterVal[1])
                ]
    if GC_Values[GC_CSV_HEADER_FILTER] or GC_Values[GC_CSV_HEADER_DROP_FILTER]:
        if GC_Values[GC_CSV_HEADER_DROP_FILTER]:
            titles = [
                t for t in titles if
                not headerFilterMatch(GC_Values[GC_CSV_HEADER_DROP_FILTER], t)
            ]
        if GC_Values[GC_CSV_HEADER_FILTER]:
            titles = [
                t for t in titles
                if headerFilterMatch(GC_Values[GC_CSV_HEADER_FILTER], t)
            ]
        if not titles:
            controlflow.system_error_exit(
                3,
                'No columns selected with GAM_CSV_HEADER_FILTER and GAM_CSV_HEADER_DROP_FILTER\n'
            )
            return
    csv.register_dialect('nixstdout', lineterminator='\n')
    if todrive:
        write_to = io.StringIO()
    else:
        write_to = sys.stdout
    writer = csv.DictWriter(write_to,
                            fieldnames=titles,
                            dialect='nixstdout',
                            extrasaction='ignore',
                            quoting=csv.QUOTE_MINIMAL)
    try:
        writer.writerow(dict((item, item) for item in writer.fieldnames))
        writer.writerows(csvRows)
    except IOError as e:
        controlflow.system_error_exit(6, e)
    if todrive:
        admin_email = gam._getValueFromOAuth('email')
        _, drive = gam.buildDrive3GAPIObject(admin_email)
        if not drive:
            print(f'''\nGAM is not authorized to create Drive files. Please run:

gam user {admin_email} check serviceaccount

and follow recommended steps to authorize GAM for Drive access.''')
            sys.exit(5)
        result = gapi.call(drive.about(), 'get', fields='maxImportSizes')
        columns = len(titles)
        rows = len(csvRows)
        cell_count = rows * columns
        data_size = len(write_to.getvalue())
        max_sheet_bytes = int(result['maxImportSizes'][MIMETYPE_GA_SPREADSHEET])
        if cell_count > MAX_GOOGLE_SHEET_CELLS or data_size > max_sheet_bytes:
            print(
                f'{WARNING_PREFIX}{MESSAGE_RESULTS_TOO_LARGE_FOR_GOOGLE_SPREADSHEET}'
            )
            mimeType = 'text/csv'
        else:
            mimeType = MIMETYPE_GA_SPREADSHEET
        body = {
            'description': QuotedArgumentList(sys.argv),
            'name': f'{GC_Values[GC_DOMAIN]} - {list_type}',
            'mimeType': mimeType
        }
        result = gapi.call(drive.files(),
                           'create',
                           fields='webViewLink',
                           body=body,
                           media_body=googleapiclient.http.MediaInMemoryUpload(
                               write_to.getvalue().encode(),
                               mimetype='text/csv'))
        file_url = result['webViewLink']
        if GC_Values[GC_NO_BROWSER]:
            msg_txt = f'Drive file uploaded to:\n {file_url}'
            msg_subj = f'{GC_Values[GC_DOMAIN]} - {list_type}'
            gam.send_email(msg_subj, msg_txt)
            print(msg_txt)
        else:
            webbrowser.open(file_url)
def print_error(message):
    """Prints a one-line error message to stderr in a standard format."""
    sys.stderr.write('\n{0}{1}\n'.format(ERROR_PREFIX, message))


def print_warning(message):
    """Prints a one-line warning message to stderr in a standard format."""
    sys.stderr.write('\n{0}{1}\n'.format(WARNING_PREFIX, message))


def print_json(object_value, spacing=''):
    """Prints Dict or Array to screen in clean human-readable format."""
    if isinstance(object_value, list):
        if len(object_value) == 1 and isinstance(object_value[0],
                                                 (str, int, bool)):
            sys.stdout.write(f'{object_value[0]}\n')
            return
        if spacing:
            sys.stdout.write('\n')
        for i, a_value in enumerate(object_value):
            if isinstance(a_value, (str, int, bool)):
                sys.stdout.write(f' {spacing}{i+1}) {a_value}\n')
            else:
                sys.stdout.write(f' {spacing}{i+1}) ')
                print_json(a_value, f' {spacing}')
    elif isinstance(object_value, dict):
        for key in ['kind', 'etag', 'etags']:
            object_value.pop(key, None)
        for another_object, another_value in object_value.items():
            sys.stdout.write(f' {spacing}{another_object}: ')
            print_json(another_value, f' {spacing}')
    else:
        sys.stdout.write(f'{object_value}\n')
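For illustration only (not part of the commit), here is a self-contained sketch of the same recursive formatting approach. The `out` parameter is an addition made here so the output can be captured without patching `sys.stdout`; the real `print_json` writes to stdout directly.

```python
import io
import sys


def print_json(object_value, spacing='', out=None):
    """Recursively pretty-prints a dict/list, dropping API metadata keys."""
    out = out or sys.stdout
    if isinstance(object_value, list):
        if len(object_value) == 1 and isinstance(object_value[0],
                                                 (str, int, bool)):
            out.write(f'{object_value[0]}\n')
            return
        if spacing:
            out.write('\n')
        for i, a_value in enumerate(object_value):
            if isinstance(a_value, (str, int, bool)):
                out.write(f' {spacing}{i+1}) {a_value}\n')
            else:
                out.write(f' {spacing}{i+1}) ')
                print_json(a_value, f' {spacing}', out)
    elif isinstance(object_value, dict):
        # 'kind'/'etag'/'etags' are Google API response metadata, not output.
        for key in ['kind', 'etag', 'etags']:
            object_value.pop(key, None)
        for k, v in object_value.items():
            out.write(f' {spacing}{k}: ')
            print_json(v, f' {spacing}', out)
    else:
        out.write(f'{object_value}\n')


buf = io.StringIO()
print_json({'kind': 'drive#file', 'name': 'report', 'owners': ['admin']},
           out=buf)
print(buf.getvalue())  # ' name: report\n owners: admin\n'
```

Note how a single-element list of scalars is printed inline rather than as a numbered list, which keeps one-value API fields readable.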

View File

@@ -10,50 +10,50 @@ from gam.var import WARNING_PREFIX
class DisplayTest(unittest.TestCase):

    def test_print_error_prints_to_stderr(self):
        message = 'test error'
        with patch.object(display.sys.stderr, 'write') as mock_write:
            display.print_error(message)
            printed_message = mock_write.call_args[0][0]
            self.assertIn(message, printed_message)

    def test_print_error_prints_error_prefix(self):
        message = 'test error'
        with patch.object(display.sys.stderr, 'write') as mock_write:
            display.print_error(message)
            printed_message = mock_write.call_args[0][0]
            self.assertLess(
                printed_message.find(ERROR_PREFIX),
                printed_message.find(message),
                'The error prefix does not appear before the error message')

    def test_print_error_ends_message_with_newline(self):
        message = 'test error'
        with patch.object(display.sys.stderr, 'write') as mock_write:
            display.print_error(message)
            printed_message = mock_write.call_args[0][0]
            self.assertRegex(printed_message, '\n$',
                             'The error message does not end in a newline.')

    def test_print_warning_prints_to_stderr(self):
        message = 'test warning'
        with patch.object(display.sys.stderr, 'write') as mock_write:
            display.print_warning(message)
            printed_message = mock_write.call_args[0][0]
            self.assertIn(message, printed_message)

    def test_print_warning_prints_warning_prefix(self):
        message = 'test warning'
        with patch.object(display.sys.stderr, 'write') as mock_write:
            display.print_warning(message)
            printed_message = mock_write.call_args[0][0]
            self.assertLess(
                printed_message.find(WARNING_PREFIX),
                printed_message.find(message),
                'The warning prefix does not appear before the warning message')

    def test_print_warning_ends_message_with_newline(self):
        message = 'test warning'
        with patch.object(display.sys.stderr, 'write') as mock_write:
            display.print_warning(message)
            printed_message = mock_write.call_args[0][0]
            self.assertRegex(printed_message, '\n$',
                             'The warning message does not end in a newline.')
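The patching pattern these tests rely on can be shown standalone. The `print_error` below is a hypothetical stand-in, not the gam implementation; the point is that patching `write` on the real `sys.stderr` object records the call instead of printing, so a test can inspect the exact output string.

```python
import sys
from unittest.mock import patch

ERROR_PREFIX = 'ERROR: '


def print_error(message):
    """Stand-in for display.print_error, for illustration only."""
    sys.stderr.write(f'\n{ERROR_PREFIX}{message}\n')


# The attribute lookup sys.stderr.write happens at call time, so the
# patched method is what print_error actually invokes.
with patch.object(sys.stderr, 'write') as mock_write:
    print_error('test error')

printed = mock_write.call_args[0][0]
print(repr(printed))  # '\nERROR: test error\n'
```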

View File

@@ -12,17 +12,19 @@ from gam.var import UTF8_SIG
def _open_file(filename, mode, encoding=None, newline=None):
    """Opens a file with no error handling."""
    # Determine which encoding to use
    if 'b' in mode:
        encoding = None
    elif not encoding:
        encoding = GM_Globals[GM_SYS_ENCODING]
    elif 'r' in mode and encoding.lower().replace('-', '') == 'utf8':
        encoding = UTF8_SIG
    return open(os.path.expanduser(filename),
                mode,
                newline=newline,
                encoding=encoding)


def open_file(filename,
@@ -30,7 +32,7 @@ def open_file(filename,
              encoding=None,
              newline=None,
              strip_utf_bom=False):
    """Opens a file.

    Args:
      filename: String, the name of the file to open, or '-' to use stdin/stdout,
@@ -47,41 +49,42 @@ def open_file(filename,
    Returns:
      The opened file.
    """
    try:
        if filename == '-':
            # Read from stdin, rather than a file
            if 'r' in mode:
                return io.StringIO(str(sys.stdin.read()))
            return sys.stdout
        # Open a file on disk
        f = _open_file(filename, mode, newline=newline, encoding=encoding)
        if strip_utf_bom:
            utf_bom = u'\ufeff'
            has_bom = False
            if 'b' in mode:
                has_bom = f.read(3).decode('UTF-8') == utf_bom
            elif f.encoding and not f.encoding.lower().startswith('utf'):
                # Convert UTF BOM into ISO-8859-1 via Bytes
                utf8_bom_bytes = utf_bom.encode('UTF-8')
                iso_8859_1_bom = utf8_bom_bytes.decode('iso-8859-1').encode(
                    'iso-8859-1')
                has_bom = f.read(3).encode('iso-8859-1',
                                           'replace') == iso_8859_1_bom
            else:
                has_bom = f.read(1) == utf_bom
            if not has_bom:
                f.seek(0)
        return f
    except IOError as e:
        controlflow.system_error_exit(6, e)
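As a standalone illustration of the BOM handling above, here is a sketch of the simplest of the three branches (text mode with a UTF-aware encoding): peek at the first character, and rewind only when it is not a byte-order mark. This helper is an assumption for demonstration, not a gam function.

```python
import io

UTF_BOM = '\ufeff'


def strip_utf_bom(f):
    """Skips a leading UTF BOM if present; rewinds otherwise."""
    if f.read(1) != UTF_BOM:
        f.seek(0)
    return f


print(strip_utf_bom(io.StringIO('\ufefffoobar')).read())  # foobar
print(strip_utf_bom(io.StringIO('foobar')).read())        # foobar
```

The binary and ISO-8859-1 branches in `open_file` follow the same peek-then-rewind idea, just comparing the BOM's byte representation instead of the decoded character.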
def close_file(f, force_flush=False):
    """Closes a file.

    Args:
      f: The file to close
@@ -92,15 +95,15 @@ def close_file(f, force_flush=False):
      Boolean, True if the file was successfully closed. False if an error
      was encountered while closing.
    """
    if force_flush:
        f.flush()
        os.fsync(f.fileno())
    try:
        f.close()
        return True
    except IOError as e:
        display.print_error(e)
        return False


def read_file(filename,
@@ -109,7 +112,7 @@ def read_file(filename,
              newline=None,
              continue_on_error=False,
              display_errors=True):
    """Reads a file from disk.

    Args:
      filename: String, the path of the file to open from disk, or "-" to read
@@ -128,22 +131,23 @@ def read_file(filename,
      The contents of the file, or stdin if filename == "-". Returns None if
      an error is encountered and continue_on_errors is True.
    """
    try:
        if filename == '-':
            # Read from stdin, rather than a file.
            return str(sys.stdin.read())
        with _open_file(filename, mode, newline=newline,
                        encoding=encoding) as f:
            return f.read()
    except IOError as e:
        if continue_on_error:
            if display_errors:
                display.print_warning(e)
            return None
        controlflow.system_error_exit(6, e)
    except (LookupError, UnicodeDecodeError, UnicodeError) as e:
        controlflow.system_error_exit(2, str(e))


def write_file(filename,
@@ -151,7 +155,7 @@ def write_file(filename,
               mode='w',
               continue_on_error=False,
               display_errors=True):
    """Writes data to a file.

    Args:
      filename: String, the path of the file to write to disk.
@@ -165,15 +169,15 @@ def write_file(filename,
    Returns:
      Boolean, True if the write operation succeeded, or False if not.
    """
    try:
        with _open_file(filename, mode) as f:
            f.write(data)
            return True
    except IOError as e:
        if continue_on_error:
            if display_errors:
                display.print_error(e)
            return False
        else:
            controlflow.system_error_exit(6, e)
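A minimal sketch of the `continue_on_error` contract shared by `read_file` and `write_file`: True on success, False on IOError when the caller opted to continue, otherwise exit with code 6. This is a standalone stand-in under simplified assumptions; the real functions also route errors through `display` and `controlflow`.

```python
import os
import sys
import tempfile


def write_file(filename, data, mode='w', continue_on_error=False):
    """Illustrative stand-in for fileutils.write_file."""
    try:
        with open(os.path.expanduser(filename), mode) as f:
            f.write(data)
        return True
    except IOError:
        if continue_on_error:
            return False
        sys.exit(6)  # the real code exits via controlflow.system_error_exit


with tempfile.NamedTemporaryFile(suffix='.txt', delete=False) as tmp:
    path = tmp.name
print(write_file(path, 'hello'))                                      # True
print(write_file('/no/such/dir/x', 'hello', continue_on_error=True))  # False
```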

View File

@@ -11,224 +11,234 @@ from gam import fileutils
class FileutilsTest(unittest.TestCase):

    def setUp(self):
        self.fake_path = '/some/path/to/file'
        super(FileutilsTest, self).setUp()

    @patch.object(fileutils.sys, 'stdin')
    def test_open_file_stdin(self, mock_stdin):
        mock_stdin.read.return_value = 'some stdin content'
        f = fileutils.open_file('-', mode='r')
        self.assertIsInstance(f, fileutils.io.StringIO)
        self.assertEqual(f.getvalue(), mock_stdin.read.return_value)

    def test_open_file_stdout(self):
        f = fileutils.open_file('-', mode='w')
        self.assertEqual(fileutils.sys.stdout, f)

    @patch.object(fileutils, 'open', new_callable=unittest.mock.mock_open)
    def test_open_file_opens_correct_path(self, mock_open):
        f = fileutils.open_file(self.fake_path)
        self.assertEqual(self.fake_path, mock_open.call_args[0][0])
        self.assertEqual(mock_open.return_value, f)

    @patch.object(fileutils, 'open', new_callable=unittest.mock.mock_open)
    def test_open_file_expands_user_file_path(self, mock_open):
        file_path = '~/some/path/containing/tilde/shortcut/to/home'
        fileutils.open_file(file_path)
        opened_path = mock_open.call_args[0][0]
        home_path = os.environ.get('HOME')
        self.assertIsNotNone(home_path)
        self.assertIn(home_path, opened_path)

    @patch.object(fileutils, 'open', new_callable=unittest.mock.mock_open)
    def test_open_file_opens_correct_mode(self, mock_open):
        fileutils.open_file(self.fake_path)
        self.assertEqual('r', mock_open.call_args[0][1])

    @patch.object(fileutils, 'open', new_callable=unittest.mock.mock_open)
    def test_open_file_encoding_for_binary(self, mock_open):
        fileutils.open_file(self.fake_path, mode='b')
        self.assertIsNone(mock_open.call_args[1]['encoding'])

    @patch.object(fileutils, 'open', new_callable=unittest.mock.mock_open)
    def test_open_file_default_system_encoding(self, mock_open):
        fileutils.open_file(self.fake_path)
        self.assertEqual(fileutils.GM_Globals[fileutils.GM_SYS_ENCODING],
                         mock_open.call_args[1]['encoding'])

    @patch.object(fileutils, 'open', new_callable=unittest.mock.mock_open)
    def test_open_file_utf8_encoding_specified(self, mock_open):
        fileutils.open_file(self.fake_path, encoding='UTF-8')
        self.assertEqual(fileutils.UTF8_SIG, mock_open.call_args[1]['encoding'])

    def test_open_file_strips_utf_bom_in_utf(self):
        bom_prefixed_data = u'\ufefffoobar'
        fake_file = io.StringIO(bom_prefixed_data)
        mock_open = MagicMock(spec=open, return_value=fake_file)
        with patch.object(fileutils, 'open', mock_open):
            f = fileutils.open_file(self.fake_path, strip_utf_bom=True)
            self.assertEqual('foobar', f.read())

    def test_open_file_strips_utf_bom_in_non_utf(self):
        bom_prefixed_data = b'\xef\xbb\xbffoobar'.decode('iso-8859-1')
        # We need to trick the method under test into believing that a StringIO
        # instance is a file with an encoding. Since StringIO does not usually
        # have an encoding, we'll mock it and add our own encoding, but send the
        # other methods in use (read and seek) back to the real StringIO object.
        real_stringio = io.StringIO(bom_prefixed_data)
        mock_file = MagicMock(spec=io.StringIO)
        mock_file.read.side_effect = real_stringio.read
        mock_file.seek.side_effect = real_stringio.seek
        mock_file.encoding = 'iso-8859-1'
        mock_open = MagicMock(spec=open, return_value=mock_file)
        with patch.object(fileutils, 'open', mock_open):
            f = fileutils.open_file(self.fake_path, strip_utf_bom=True)
            self.assertEqual('foobar', f.read())

    def test_open_file_strips_utf_bom_in_binary(self):
        bom_prefixed_data = u'\ufefffoobar'.encode('UTF-8')
        fake_file = io.BytesIO(bom_prefixed_data)
        mock_open = MagicMock(spec=open, return_value=fake_file)
        with patch.object(fileutils, 'open', mock_open):
            f = fileutils.open_file(self.fake_path,
                                    mode='rb',
                                    strip_utf_bom=True)
            self.assertEqual(b'foobar', f.read())

    def test_open_file_strip_utf_bom_when_no_bom_in_data(self):
        no_bom_data = 'This data has no BOM'
        fake_file = io.StringIO(no_bom_data)
        mock_open = MagicMock(spec=open, return_value=fake_file)
        with patch.object(fileutils, 'open', mock_open):
            f = fileutils.open_file(self.fake_path, strip_utf_bom=True)
            # Since there was no opening BOM, we should be back at the
            # beginning of the file.
            self.assertEqual(fake_file.tell(), 0)
            self.assertEqual(f.read(), no_bom_data)

    @patch.object(fileutils, 'open', new_callable=unittest.mock.mock_open)
    def test_open_file_exits_on_io_error(self, mock_open):
        mock_open.side_effect = IOError('Fake IOError')
        with self.assertRaises(SystemExit) as context:
            fileutils.open_file(self.fake_path)
        self.assertEqual(context.exception.code, 6)

    def test_close_file_closes_file_successfully(self):
        mock_file = MagicMock()
        self.assertTrue(fileutils.close_file(mock_file))
        self.assertEqual(mock_file.close.call_count, 1)

    def test_close_file_with_error(self):
        mock_file = MagicMock()
        mock_file.close.side_effect = IOError()
        self.assertFalse(fileutils.close_file(mock_file))
        self.assertEqual(mock_file.close.call_count, 1)
    @patch.object(fileutils.sys, 'stdin')
    def test_read_file_from_stdin(self, mock_stdin):
        mock_stdin.read.return_value = 'some stdin content'
        self.assertEqual(fileutils.read_file('-'), mock_stdin.read.return_value)

    @patch.object(fileutils, '_open_file')
    def test_read_file_default_params(self, mock_open_file):
        fake_content = 'some fake content'
        mock_open_file.return_value.__enter__().read.return_value = fake_content
        self.assertEqual(fileutils.read_file(self.fake_path), fake_content)
        self.assertEqual(mock_open_file.call_args[0][0], self.fake_path)
        self.assertEqual(mock_open_file.call_args[0][1], 'r')
        self.assertIsNone(mock_open_file.call_args[1]['newline'])

    @patch.object(fileutils.display, 'print_warning')
    @patch.object(fileutils, '_open_file')
    def test_read_file_continues_on_errors_without_displaying(
            self, mock_open_file, mock_print_warning):
        mock_open_file.side_effect = IOError()
        contents = fileutils.read_file(self.fake_path,
                                       continue_on_error=True,
                                       display_errors=False)
        self.assertIsNone(contents)
        self.assertFalse(mock_print_warning.called)

    @patch.object(fileutils.display, 'print_warning')
    @patch.object(fileutils, '_open_file')
    def test_read_file_displays_errors(self, mock_open_file,
                                       mock_print_warning):
        mock_open_file.side_effect = IOError()
        fileutils.read_file(self.fake_path,
                            continue_on_error=True,
                            display_errors=True)
        self.assertTrue(mock_print_warning.called)

    @patch.object(fileutils, '_open_file')
    def test_read_file_exits_code_6_when_continue_on_error_is_false(
            self, mock_open_file):
        mock_open_file.side_effect = IOError()
        with self.assertRaises(SystemExit) as context:
            fileutils.read_file(self.fake_path, continue_on_error=False)
        self.assertEqual(context.exception.code, 6)

    @patch.object(fileutils, '_open_file')
    def test_read_file_exits_code_2_on_lookuperror(self, mock_open_file):
        mock_open_file.return_value.__enter__().read.side_effect = LookupError()
        with self.assertRaises(SystemExit) as context:
            fileutils.read_file(self.fake_path)
        self.assertEqual(context.exception.code, 2)

    @patch.object(fileutils, '_open_file')
    def test_read_file_exits_code_2_on_unicodeerror(self, mock_open_file):
        mock_open_file.return_value.__enter__().read.side_effect = UnicodeError(
        )
        with self.assertRaises(SystemExit) as context:
            fileutils.read_file(self.fake_path)
        self.assertEqual(context.exception.code, 2)

    @patch.object(fileutils, '_open_file')
    def test_read_file_exits_code_2_on_unicodedecodeerror(self, mock_open_file):
        fake_decode_error = UnicodeDecodeError('fake-encoding', b'fakebytes', 0,
                                               1, 'testing only')
        mock_open_file.return_value.__enter__(
        ).read.side_effect = fake_decode_error
        with self.assertRaises(SystemExit) as context:
            fileutils.read_file(self.fake_path)
        self.assertEqual(context.exception.code, 2)

    @patch.object(fileutils, '_open_file')
    def test_write_file_writes_data_to_file(self, mock_open_file):
        fake_data = 'some fake data'
        fileutils.write_file(self.fake_path, fake_data)
        self.assertEqual(mock_open_file.call_args[0][0], self.fake_path)
        self.assertEqual(mock_open_file.call_args[0][1], 'w')
        opened_file = mock_open_file.return_value.__enter__()
        self.assertTrue(opened_file.write.called)
        self.assertEqual(opened_file.write.call_args[0][0], fake_data)

    @patch.object(fileutils.display, 'print_error')
    @patch.object(fileutils, '_open_file')
    def test_write_file_continues_on_errors_without_displaying(
            self, mock_open_file, mock_print_error):
        mock_open_file.side_effect = IOError()
        status = fileutils.write_file(self.fake_path,
                                      'foo data',
                                      continue_on_error=True,
                                      display_errors=False)
        self.assertFalse(status)
        self.assertFalse(mock_print_error.called)

    @patch.object(fileutils.display, 'print_error')
    @patch.object(fileutils, '_open_file')
    def test_write_file_displays_errors(self, mock_open_file, mock_print_error):
        mock_open_file.side_effect = IOError()
        fileutils.write_file(self.fake_path,
                             'foo data',
                             continue_on_error=True,
                             display_errors=True)
        self.assertTrue(mock_print_error.called)

    @patch.object(fileutils, '_open_file')
    def test_write_file_exits_code_6_when_continue_on_error_is_false(
            self, mock_open_file):
        mock_open_file.side_effect = IOError()
        with self.assertRaises(SystemExit) as context:
            fileutils.write_file(self.fake_path,
                                 'foo data',
                                 continue_on_error=False)
        self.assertEqual(context.exception.code, 6)


if __name__ == '__main__':
    unittest.main()

View File

@@ -23,7 +23,7 @@ def call(service,
         throw_reasons=None,
         retry_reasons=None,
         **kwargs):
    """Executes a single request on a Google service function.

    Args:
      service: A Google service object for the desired API.
@@ -42,68 +42,76 @@ def call(service,
    Returns:
      A response object for the corresponding Google API call.
    """
    if throw_reasons is None:
        throw_reasons = []
    if retry_reasons is None:
        retry_reasons = []

    method = getattr(service, function)
    retries = 10
    parameters = dict(
        list(kwargs.items()) + list(GM_Globals[GM_EXTRA_ARGS_DICT].items()))
    for n in range(1, retries + 1):
        try:
            return method(**parameters).execute()
        except googleapiclient.errors.HttpError as e:
            http_status, reason, message = errors.get_gapi_error_detail(
                e,
                soft_errors=soft_errors,
                silent_errors=silent_errors,
                retry_on_http_error=n < 3)
            if http_status == -1:
                # The error detail indicated that we should retry this request
                # We'll refresh credentials and make another pass
                service._http.credentials.refresh(transport.create_http())
                continue
            if http_status == 0:
                return None

            is_known_error_reason = reason in [
                r.value for r in errors.ErrorReason
            ]
            if is_known_error_reason and errors.ErrorReason(
                    reason) in throw_reasons:
                if errors.ErrorReason(
                        reason) in errors.ERROR_REASON_TO_EXCEPTION:
                    raise errors.ERROR_REASON_TO_EXCEPTION[errors.ErrorReason(
                        reason)](message)
                raise e
            if (n != retries) and (is_known_error_reason and errors.ErrorReason(
                    reason) in errors.DEFAULT_RETRY_REASONS + retry_reasons):
                controlflow.wait_on_failure(n, retries, reason)
                continue
            if soft_errors:
                display.print_error(
                    f'{http_status}: {message} - {reason}{["", ": Giving up."][n > 1]}'
                )
                return None
            controlflow.system_error_exit(
                int(http_status), f'{http_status}: {message} - {reason}')
        except google.auth.exceptions.RefreshError as e:
            handle_oauth_token_error(
                e, soft_errors or
                errors.ErrorReason.SERVICE_NOT_AVAILABLE in throw_reasons)
            if errors.ErrorReason.SERVICE_NOT_AVAILABLE in throw_reasons:
                raise errors.GapiServiceNotAvailableError(str(e))
            display.print_error(
                f'User {GM_Globals[GM_CURRENT_API_USER]}: {str(e)}')
            return None
        except ValueError as e:
            if hasattr(service._http,
                       'cache') and service._http.cache is not None:
                service._http.cache = None
                continue
            controlflow.system_error_exit(4, str(e))
        except (httplib2.ServerNotFoundError, RuntimeError) as e:
            if n != retries:
                service._http.connections = {}
                controlflow.wait_on_failure(n, retries, str(e))
                continue
            controlflow.system_error_exit(4, str(e))
        except TypeError as e:
            controlflow.system_error_exit(4, str(e))


def get_items(service,
@@ -112,7 +120,7 @@ def get_items(service,
              throw_reasons=None,
              retry_reasons=None,
              **kwargs):
    """Gets a single page of items from a Google service function that is paged.

    Args:
      service: A Google service object for the desired API.
@@ -130,19 +138,18 @@ def get_items(service,
    Returns:
      The list of items in the first page of a response.
    """
    results = call(service,
                   function,
                   throw_reasons=throw_reasons,
                   retry_reasons=retry_reasons,
                   **kwargs)
    if results:
        return results.get(items, [])
    return []


def _get_max_page_size_for_api_call(service, function, **kwargs):
    """Gets the maximum number of results supported for a single API call.

    Args:
      service: A Google service object for the desired API.
@@ -153,31 +160,33 @@ def _get_max_page_size_for_api_call(service, function, **kwargs):
      Int, A value from discovery if it exists, otherwise value from
      MAX_RESULTS_API_EXCEPTIONS, otherwise None
    """
    method = getattr(service, function)
    api_id = method(**kwargs).methodId
    for resource in service._rootDesc.get('resources', {}).values():
        for a_method in resource.get('methods', {}).values():
            if a_method.get('id') == api_id:
                if not a_method.get('parameters') or a_method['parameters'].get(
                        'pageSize'
                ) or not a_method['parameters'].get('maxResults'):
                    # Make sure API call supports maxResults. For now we don't care to
                    # set pageSize since all known pageSize API calls have
                    # default pageSize == max pageSize.
                    return None
                known_api_max = MAX_RESULTS_API_EXCEPTIONS.get(api_id)
                max_results = a_method['parameters']['maxResults'].get(
                    'maximum', known_api_max)
                return {'maxResults': max_results}
    return None


TOTAL_ITEMS_MARKER = '%%total_items%%'
FIRST_ITEM_MARKER = '%%first_item%%'
LAST_ITEM_MARKER = '%%last_item%%'


def got_total_items_msg(items, eol):
    """Format a page_message to be used by get_all_pages

    The page message indicates the number of items returned
@@ -190,10 +199,11 @@ def got_total_items_msg(items, eol):
      The formatted page_message
    """
    return f'Got {TOTAL_ITEMS_MARKER} {items}{eol}'


def got_total_items_first_last_msg(items):
    """Format a page_message to be used by get_all_pages

    The page message indicates the number of items returned and the
    value of the first and last items
@@ -205,7 +215,8 @@ def got_total_items_first_last_msg(items):
      The formatted page_message
    """
    return f'Got {TOTAL_ITEMS_MARKER} {items}: {FIRST_ITEM_MARKER} - {LAST_ITEM_MARKER}' + '\n'


def get_all_pages(service,
                  function,
@@ -216,7 +227,7 @@ def get_all_pages(service,
                  throw_reasons=None,
                  retry_reasons=None,
                  **kwargs):
    """Aggregates and returns all pages of a Google service function response.

    All pages of items are aggregated and returned as a single list.
@@ -250,79 +261,83 @@ def get_all_pages(service,
    Returns:
      A list of all items received from all paged responses.
    """
    if 'maxResults' not in kwargs and 'pageSize' not in kwargs:
        page_key = _get_max_page_size_for_api_call(service, function, **kwargs)
        if page_key:
            kwargs.update(page_key)
    all_items = []
    page_token = None
    total_items = 0
    while True:
        page = call(service,
                    function,
                    soft_errors=soft_errors,
                    throw_reasons=throw_reasons,
                    retry_reasons=retry_reasons,
                    pageToken=page_token,
                    **kwargs)
        if page:
            page_token = page.get('nextPageToken')
            page_items = page.get(items, [])
            num_page_items = len(page_items)
            total_items += num_page_items
            all_items.extend(page_items)
        else:
            page_token = None
            num_page_items = 0

        # Show a paging message to the user that indicates paging progress
        if page_message:
            show_message = page_message.replace(TOTAL_ITEMS_MARKER,
                                               str(total_items))
            if message_attribute:
                first_item = page_items[0] if num_page_items > 0 else {}
                last_item = page_items[-1] if num_page_items > 1 else first_item
                show_message = show_message.replace(
                    FIRST_ITEM_MARKER,
                    str(first_item.get(message_attribute, '')))
                show_message = show_message.replace(
                    LAST_ITEM_MARKER, str(last_item.get(message_attribute, '')))
            sys.stderr.write('\r')
            sys.stderr.flush()
            sys.stderr.write(show_message)

        if not page_token:
            # End the paging status message and return all items.
            if page_message and (page_message[-1] != '\n'):
                sys.stderr.write('\r\n')
                sys.stderr.flush()
            return all_items


# TODO: Make this private once all execution related items that use this method
# have been brought into this file
def handle_oauth_token_error(e, soft_errors):
    """On a token error, exits the application and writes a message to stderr.

    Args:
      e: google.auth.exceptions.RefreshError, The error to handle.
      soft_errors: Boolean, if True, suppresses any applicable errors and instead
        returns to the caller.
    """
    token_error = str(e).replace('.', '')
    if token_error in errors.OAUTH2_TOKEN_ERRORS or e.startswith(
            'Invalid response'):
        if soft_errors:
            return
        if not GM_Globals[GM_CURRENT_API_USER]:
            display.print_error(
                MESSAGE_API_ACCESS_DENIED.format(
                    GM_Globals[GM_OAUTH2SERVICE_ACCOUNT_CLIENT_ID],
                    ','.join(GM_Globals[GM_CURRENT_API_SCOPES])))
            controlflow.system_error_exit(12, MESSAGE_API_ACCESS_CONFIG)
        else:
            controlflow.system_error_exit(
                19,
                MESSAGE_SERVICE_NOT_APPLICABLE.format(
                    GM_Globals[GM_CURRENT_API_USER]))
    controlflow.system_error_exit(18, f'Authentication Token Error - {str(e)}')


def get_enum_values_minus_unspecified(values):
    return [a_type for a_type in values if '_UNSPECIFIED' not in a_type]
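The core of `call()` above is a bounded retry loop: attempt the request up to `retries` times, and between failed attempts back off via `controlflow.wait_on_failure` when the error reason is retriable. A minimal standalone sketch of that pattern, using hypothetical stand-ins (`flaky_request`, `wait_on_failure`) rather than GAM's real service objects:

```python
def wait_on_failure(attempt, total_retries, reason):
    # The real helper sleeps with exponential backoff (roughly 2**attempt
    # seconds plus jitter); omitted here so the sketch runs instantly.
    pass


def call_with_retries(request, retries=10, retry_reasons=('backendError',)):
    """Retry `request` up to `retries` times for retriable failure reasons."""
    for n in range(1, retries + 1):
        try:
            return request()
        except RuntimeError as e:
            reason = str(e)
            # Retry only if attempts remain and the reason is retriable;
            # otherwise surface the error, mirroring call()'s hard-exit path.
            if n != retries and reason in retry_reasons:
                wait_on_failure(n, retries, reason)
                continue
            raise


attempts = {'count': 0}


def flaky_request():
    # Fails twice with a retriable reason, then succeeds.
    attempts['count'] += 1
    if attempts['count'] < 3:
        raise RuntimeError('backendError')
    return {'status': 'ok'}
```

Here `call_with_retries(flaky_request)` absorbs the first two failures and returns the third attempt's response, which is the behavior the retry tests below exercise against mocked Google services.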

View File

@ -11,7 +11,7 @@ from gam.gapi import errors
def create_http_error(status, reason, message): def create_http_error(status, reason, message):
"""Creates a HttpError object similar to most Google API Errors. """Creates a HttpError object similar to most Google API Errors.
Args: Args:
status: Int, the error's HTTP response status number. status: Int, the error's HTTP response status number.
@ -21,482 +21,499 @@ def create_http_error(status, reason, message):
Returns: Returns:
googleapiclient.errors.HttpError googleapiclient.errors.HttpError
""" """
response = { response = {
'status': status, 'status': status,
'content-type': 'application/json', 'content-type': 'application/json',
} }
content = { content = {
'error': { 'error': {
'code': status, 'code': status,
'errors': [{ 'errors': [{
'reason': str(reason), 'reason': str(reason),
'message': message, 'message': message,
}] }]
} }
} }
content_bytes = json.dumps(content).encode('UTF-8') content_bytes = json.dumps(content).encode('UTF-8')
return gapi.googleapiclient.errors.HttpError(response, content_bytes) return gapi.googleapiclient.errors.HttpError(response, content_bytes)
class GapiTest(unittest.TestCase): class GapiTest(unittest.TestCase):
def setUp(self): def setUp(self):
SetGlobalVariables() SetGlobalVariables()
self.mock_service = MagicMock() self.mock_service = MagicMock()
self.mock_method_name = 'mock_method' self.mock_method_name = 'mock_method'
self.mock_method = getattr(self.mock_service, self.mock_method_name) self.mock_method = getattr(self.mock_service, self.mock_method_name)
self.simple_3_page_response = [ self.simple_3_page_response = [
{ {
'items': [{ 'items': [{
'position': 'page1,item1' 'position': 'page1,item1'
}, { }, {
'position': 'page1,item2' 'position': 'page1,item2'
}, { }, {
'position': 'page1,item3' 'position': 'page1,item3'
}], }],
'nextPageToken': 'page2' 'nextPageToken': 'page2'
}, },
{ {
'items': [{ 'items': [{
'position': 'page2,item1' 'position': 'page2,item1'
}, { }, {
'position': 'page2,item2' 'position': 'page2,item2'
}, { }, {
'position': 'page2,item3' 'position': 'page2,item3'
}], }],
'nextPageToken': 'page3' 'nextPageToken': 'page3'
}, },
{ {
'items': [{ 'items': [{
'position': 'page3,item1' 'position': 'page3,item1'
}, { }, {
'position': 'page3,item2' 'position': 'page3,item2'
}, { }, {
'position': 'page3,item3' 'position': 'page3,item3'
}], }],
}, },
] ]
self.empty_items_response = {'items': []} self.empty_items_response = {'items': []}
super(GapiTest, self).setUp() super(GapiTest, self).setUp()
def test_call_returns_basic_200_response(self): def test_call_returns_basic_200_response(self):
response = gapi.call(self.mock_service, self.mock_method_name) response = gapi.call(self.mock_service, self.mock_method_name)
self.assertEqual(response, self.mock_method().execute.return_value) self.assertEqual(response, self.mock_method().execute.return_value)
def test_call_passes_target_method_params(self): def test_call_passes_target_method_params(self):
gapi.call( gapi.call(self.mock_service,
self.mock_service, self.mock_method_name, my_param_1=1, my_param_2=2) self.mock_method_name,
self.assertEqual(self.mock_method.call_count, 1) my_param_1=1,
method_kwargs = self.mock_method.call_args[1] my_param_2=2)
self.assertEqual(method_kwargs.get('my_param_1'), 1) self.assertEqual(self.mock_method.call_count, 1)
self.assertEqual(method_kwargs.get('my_param_2'), 2) method_kwargs = self.mock_method.call_args[1]
self.assertEqual(method_kwargs.get('my_param_1'), 1)
self.assertEqual(method_kwargs.get('my_param_2'), 2)
@patch.object(gapi.errors, 'get_gapi_error_detail') @patch.object(gapi.errors, 'get_gapi_error_detail')
def test_call_retries_with_soft_errors(self, mock_error_detail): def test_call_retries_with_soft_errors(self, mock_error_detail):
mock_error_detail.return_value = (-1, 'aReason', 'some message') mock_error_detail.return_value = (-1, 'aReason', 'some message')
# Make the request fail first, then return the proper response on the retry. # Make the request fail first, then return the proper response on the retry.
fake_http_error = create_http_error(403, 'aReason', 'unused message') fake_http_error = create_http_error(403, 'aReason', 'unused message')
fake_200_response = MagicMock() fake_200_response = MagicMock()
self.mock_method.return_value.execute.side_effect = [ self.mock_method.return_value.execute.side_effect = [
fake_http_error, fake_200_response fake_http_error, fake_200_response
] ]
response = gapi.call( response = gapi.call(self.mock_service,
self.mock_service, self.mock_method_name, soft_errors=True) self.mock_method_name,
self.assertEqual(response, fake_200_response) soft_errors=True)
self.assertEqual( self.assertEqual(response, fake_200_response)
self.mock_service._http.credentials.refresh.call_count, 1) self.assertEqual(self.mock_service._http.credentials.refresh.call_count,
self.assertEqual(self.mock_method.return_value.execute.call_count, 2) 1)
self.assertEqual(self.mock_method.return_value.execute.call_count, 2)
def test_call_throws_for_provided_reason(self): def test_call_throws_for_provided_reason(self):
throw_reason = errors.ErrorReason.USER_NOT_FOUND throw_reason = errors.ErrorReason.USER_NOT_FOUND
fake_http_error = create_http_error(404, throw_reason, 'forced throw') fake_http_error = create_http_error(404, throw_reason, 'forced throw')
self.mock_method.return_value.execute.side_effect = fake_http_error self.mock_method.return_value.execute.side_effect = fake_http_error
gam_exception = errors.ERROR_REASON_TO_EXCEPTION[throw_reason] gam_exception = errors.ERROR_REASON_TO_EXCEPTION[throw_reason]
with self.assertRaises(gam_exception): with self.assertRaises(gam_exception):
gapi.call( gapi.call(self.mock_service,
self.mock_service, self.mock_method_name,
self.mock_method_name, throw_reasons=[throw_reason])
throw_reasons=[throw_reason])
# Prevent wait_on_failure from performing actual backoff unnecessarily, since # Prevent wait_on_failure from performing actual backoff unnecessarily, since
# we're not actually testing over a network connection # we're not actually testing over a network connection
@patch.object(gapi.controlflow, 'wait_on_failure') @patch.object(gapi.controlflow, 'wait_on_failure')
def test_call_retries_request_for_default_retry_reasons( def test_call_retries_request_for_default_retry_reasons(
self, mock_wait_on_failure): self, mock_wait_on_failure):
# Test using one of the default retry reasons # Test using one of the default retry reasons
default_throw_reason = errors.ErrorReason.BACKEND_ERROR default_throw_reason = errors.ErrorReason.BACKEND_ERROR
self.assertIn(default_throw_reason, errors.DEFAULT_RETRY_REASONS) self.assertIn(default_throw_reason, errors.DEFAULT_RETRY_REASONS)
fake_http_error = create_http_error(404, default_throw_reason, 'message') fake_http_error = create_http_error(404, default_throw_reason,
fake_200_response = MagicMock() 'message')
# Fail once, then succeed on retry fake_200_response = MagicMock()
self.mock_method.return_value.execute.side_effect = [ # Fail once, then succeed on retry
fake_http_error, fake_200_response self.mock_method.return_value.execute.side_effect = [
] fake_http_error, fake_200_response
]
response = gapi.call( response = gapi.call(self.mock_service,
self.mock_service, self.mock_method_name, retry_reasons=[]) self.mock_method_name,
self.assertEqual(response, fake_200_response) retry_reasons=[])
self.assertEqual(self.mock_method.return_value.execute.call_count, 2) self.assertEqual(response, fake_200_response)
# Make sure a backoff technique was used for retry. self.assertEqual(self.mock_method.return_value.execute.call_count, 2)
self.assertEqual(mock_wait_on_failure.call_count, 1) # Make sure a backoff technique was used for retry.
self.assertEqual(mock_wait_on_failure.call_count, 1)
# Prevent wait_on_failure from performing actual backoff unnecessarily, since # Prevent wait_on_failure from performing actual backoff unnecessarily, since
# we're not actually testing over a network connection # we're not actually testing over a network connection
@patch.object(gapi.controlflow, 'wait_on_failure') @patch.object(gapi.controlflow, 'wait_on_failure')
def test_call_retries_requests_for_provided_retry_reasons( def test_call_retries_requests_for_provided_retry_reasons(
self, unused_mock_wait_on_failure): self, unused_mock_wait_on_failure):
retry_reason1 = errors.ErrorReason.INTERNAL_ERROR retry_reason1 = errors.ErrorReason.INTERNAL_ERROR
fake_retrieable_error1 = create_http_error(400, retry_reason1, fake_retrieable_error1 = create_http_error(400, retry_reason1,
'Forced Error 1') 'Forced Error 1')
retry_reason2 = errors.ErrorReason.SYSTEM_ERROR retry_reason2 = errors.ErrorReason.SYSTEM_ERROR
fake_retrieable_error2 = create_http_error(400, retry_reason2, fake_retrieable_error2 = create_http_error(400, retry_reason2,
'Forced Error 2') 'Forced Error 2')
non_retriable_reason = errors.ErrorReason.SERVICE_NOT_AVAILABLE non_retriable_reason = errors.ErrorReason.SERVICE_NOT_AVAILABLE
fake_non_retriable_error = create_http_error( fake_non_retriable_error = create_http_error(
400, non_retriable_reason, 400, non_retriable_reason,
'This error should not cause the request to be retried') 'This error should not cause the request to be retried')
# Fail once, then succeed on retry # Fail once, then succeed on retry
self.mock_method.return_value.execute.side_effect = [ self.mock_method.return_value.execute.side_effect = [
fake_retrieable_error1, fake_retrieable_error2, fake_non_retriable_error fake_retrieable_error1, fake_retrieable_error2,
] fake_non_retriable_error
]
with self.assertRaises(SystemExit): with self.assertRaises(SystemExit):
# The third call should raise the SystemExit when non_retriable_error is # The third call should raise the SystemExit when non_retriable_error is
# raised. # raised.
gapi.call( gapi.call(self.mock_service,
self.mock_service, self.mock_method_name,
self.mock_method_name, retry_reasons=[retry_reason1, retry_reason2])
retry_reasons=[retry_reason1, retry_reason2])
self.assertEqual(self.mock_method.return_value.execute.call_count, 3) self.assertEqual(self.mock_method.return_value.execute.call_count, 3)
def test_call_exits_on_oauth_token_error(self): def test_call_exits_on_oauth_token_error(self):
# An error with any OAUTH2_TOKEN_ERROR # An error with any OAUTH2_TOKEN_ERROR
fake_token_error = gapi.google.auth.exceptions.RefreshError( fake_token_error = gapi.google.auth.exceptions.RefreshError(
errors.OAUTH2_TOKEN_ERRORS[0]) errors.OAUTH2_TOKEN_ERRORS[0])
self.mock_method.return_value.execute.side_effect = fake_token_error self.mock_method.return_value.execute.side_effect = fake_token_error
with self.assertRaises(SystemExit): with self.assertRaises(SystemExit):
gapi.call(self.mock_service, self.mock_method_name) gapi.call(self.mock_service, self.mock_method_name)
def test_call_exits_on_nonretriable_error(self): def test_call_exits_on_nonretriable_error(self):
error_reason = 'unknownReason' error_reason = 'unknownReason'
fake_http_error = create_http_error(500, error_reason, fake_http_error = create_http_error(500, error_reason,
'Testing unretriable errors') 'Testing unretriable errors')
self.mock_method.return_value.execute.side_effect = fake_http_error self.mock_method.return_value.execute.side_effect = fake_http_error
with self.assertRaises(SystemExit): with self.assertRaises(SystemExit):
gapi.call(self.mock_service, self.mock_method_name) gapi.call(self.mock_service, self.mock_method_name)
def test_call_exits_on_request_valueerror(self): def test_call_exits_on_request_valueerror(self):
self.mock_method.return_value.execute.side_effect = ValueError() self.mock_method.return_value.execute.side_effect = ValueError()
with self.assertRaises(SystemExit): with self.assertRaises(SystemExit):
gapi.call(self.mock_service, self.mock_method_name) gapi.call(self.mock_service, self.mock_method_name)
def test_call_clears_bad_http_cache_on_request_failure(self): def test_call_clears_bad_http_cache_on_request_failure(self):
self.mock_service._http.cache = 'something that is not None' self.mock_service._http.cache = 'something that is not None'
fake_200_response = MagicMock() fake_200_response = MagicMock()
self.mock_method.return_value.execute.side_effect = [ self.mock_method.return_value.execute.side_effect = [
ValueError(), fake_200_response ValueError(), fake_200_response
] ]
self.assertIsNotNone(self.mock_service._http.cache) self.assertIsNotNone(self.mock_service._http.cache)
response = gapi.call(self.mock_service, self.mock_method_name) response = gapi.call(self.mock_service, self.mock_method_name)
self.assertEqual(response, fake_200_response) self.assertEqual(response, fake_200_response)
# Assert the cache was cleared # Assert the cache was cleared
self.assertIsNone(self.mock_service._http.cache) self.assertIsNone(self.mock_service._http.cache)
# Prevent wait_on_failure from performing actual backoff unnecessarily, since # Prevent wait_on_failure from performing actual backoff unnecessarily, since
# we're not actually testing over a network connection # we're not actually testing over a network connection
@patch.object(gapi.controlflow, 'wait_on_failure') @patch.object(gapi.controlflow, 'wait_on_failure')
def test_call_retries_requests_with_backoff_on_servernotfounderror( def test_call_retries_requests_with_backoff_on_servernotfounderror(
self, mock_wait_on_failure): self, mock_wait_on_failure):
fake_servernotfounderror = gapi.httplib2.ServerNotFoundError() fake_servernotfounderror = gapi.httplib2.ServerNotFoundError()
fake_200_response = MagicMock() fake_200_response = MagicMock()
# Fail once, then succeed on retry # Fail once, then succeed on retry
self.mock_method.return_value.execute.side_effect = [ self.mock_method.return_value.execute.side_effect = [
fake_servernotfounderror, fake_200_response fake_servernotfounderror, fake_200_response
] ]
http_connections = self.mock_service._http.connections http_connections = self.mock_service._http.connections
response = gapi.call(self.mock_service, self.mock_method_name) response = gapi.call(self.mock_service, self.mock_method_name)
self.assertEqual(response, fake_200_response) self.assertEqual(response, fake_200_response)
# HTTP cached connections should be cleared on receiving this error # HTTP cached connections should be cleared on receiving this error
self.assertNotEqual(http_connections, self.mock_service._http.connections) self.assertNotEqual(http_connections,
self.assertEqual(self.mock_method.return_value.execute.call_count, 2) self.mock_service._http.connections)
# Make sure a backoff technique was used for retry. self.assertEqual(self.mock_method.return_value.execute.call_count, 2)
self.assertEqual(mock_wait_on_failure.call_count, 1) # Make sure a backoff technique was used for retry.
self.assertEqual(mock_wait_on_failure.call_count, 1)
def test_get_items_calls_correct_service_function(self): def test_get_items_calls_correct_service_function(self):
gapi.get_items(self.mock_service, self.mock_method_name) gapi.get_items(self.mock_service, self.mock_method_name)
self.assertTrue(self.mock_method.called) self.assertTrue(self.mock_method.called)
def test_get_items_returns_one_page(self): def test_get_items_returns_one_page(self):
fake_response = {'items': [{}, {}, {}]} fake_response = {'items': [{}, {}, {}]}
self.mock_method.return_value.execute.return_value = fake_response self.mock_method.return_value.execute.return_value = fake_response
page = gapi.get_items(self.mock_service, self.mock_method_name) page = gapi.get_items(self.mock_service, self.mock_method_name)
self.assertEqual(page, fake_response['items']) self.assertEqual(page, fake_response['items'])
def test_get_items_non_default_page_field_name(self): def test_get_items_non_default_page_field_name(self):
field_name = 'things' field_name = 'things'
fake_response = {field_name: [{}, {}, {}]} fake_response = {field_name: [{}, {}, {}]}
self.mock_method.return_value.execute.return_value = fake_response self.mock_method.return_value.execute.return_value = fake_response
page = gapi.get_items( page = gapi.get_items(self.mock_service,
self.mock_service, self.mock_method_name, items=field_name) self.mock_method_name,
self.assertEqual(page, fake_response[field_name]) items=field_name)
self.assertEqual(page, fake_response[field_name])
def test_get_items_passes_additional_kwargs_to_service(self): def test_get_items_passes_additional_kwargs_to_service(self):
gapi.get_items( gapi.get_items(self.mock_service,
self.mock_service, self.mock_method_name, my_param_1=1, my_param_2=2) self.mock_method_name,
self.assertEqual(self.mock_method.call_count, 1) my_param_1=1,
method_kwargs = self.mock_method.call_args[1] my_param_2=2)
self.assertEqual(1, method_kwargs.get('my_param_1')) self.assertEqual(self.mock_method.call_count, 1)
self.assertEqual(2, method_kwargs.get('my_param_2')) method_kwargs = self.mock_method.call_args[1]
self.assertEqual(1, method_kwargs.get('my_param_1'))
self.assertEqual(2, method_kwargs.get('my_param_2'))
def test_get_items_returns_empty_list_when_no_items_returned(self): def test_get_items_returns_empty_list_when_no_items_returned(self):
non_items_response = {'noItemsInThisResponse': {}} non_items_response = {'noItemsInThisResponse': {}}
self.mock_method.return_value.execute.return_value = non_items_response self.mock_method.return_value.execute.return_value = non_items_response
page = gapi.get_items(self.mock_service, self.mock_method_name) page = gapi.get_items(self.mock_service, self.mock_method_name)
self.assertIsInstance(page, list) self.assertIsInstance(page, list)
self.assertEqual(0, len(page)) self.assertEqual(0, len(page))
     def test_get_all_pages_returns_all_items(self):
         page_1 = {'items': ['1-1', '1-2', '1-3'], 'nextPageToken': '2'}
         page_2 = {'items': ['2-1', '2-2', '2-3'], 'nextPageToken': '3'}
         page_3 = {'items': ['3-1', '3-2', '3-3']}
-        self.mock_method.return_value.execute.side_effect = [page_1, page_2, page_3]
+        self.mock_method.return_value.execute.side_effect = [
+            page_1, page_2, page_3
+        ]
         response_items = gapi.get_all_pages(self.mock_service,
                                             self.mock_method_name)
-        self.assertListEqual(response_items,
-                             page_1['items'] + page_2['items'] + page_3['items'])
+        self.assertListEqual(
+            response_items, page_1['items'] + page_2['items'] + page_3['items'])

     def test_get_all_pages_includes_next_pagetoken_in_request(self):
         page_1 = {'items': ['1-1', '1-2', '1-3'], 'nextPageToken': 'someToken'}
         page_2 = {'items': ['2-1', '2-2', '2-3']}
         self.mock_method.return_value.execute.side_effect = [page_1, page_2]
-        gapi.get_all_pages(self.mock_service, self.mock_method_name, pageSize=100)
+        gapi.get_all_pages(self.mock_service,
+                           self.mock_method_name,
+                           pageSize=100)
         self.assertEqual(self.mock_method.call_count, 2)
         call_2_kwargs = self.mock_method.call_args_list[1][1]
         self.assertIn('pageToken', call_2_kwargs)
         self.assertEqual(call_2_kwargs['pageToken'], page_1['nextPageToken'])

     def test_get_all_pages_uses_default_max_page_size(self):
         sample_api_id = list(gapi.MAX_RESULTS_API_EXCEPTIONS.keys())[0]
         sample_api_max_results = gapi.MAX_RESULTS_API_EXCEPTIONS[sample_api_id]
         self.mock_method.return_value.methodId = sample_api_id
         self.mock_service._rootDesc = {
             'resources': {
                 'someResource': {
                     'methods': {
                         'someMethod': {
                             'id': sample_api_id,
                             'parameters': {
                                 'maxResults': {
                                     'maximum': sample_api_max_results
                                 }
                             }
                         }
                     }
                 }
             }
         }
         self.mock_method.return_value.execute.return_value = self.empty_items_response
         gapi.get_all_pages(self.mock_service, self.mock_method_name)
         request_method_kwargs = self.mock_method.call_args[1]
         self.assertIn('maxResults', request_method_kwargs)
         self.assertEqual(request_method_kwargs['maxResults'],
                          gapi.MAX_RESULTS_API_EXCEPTIONS.get(sample_api_id))

     def test_get_all_pages_max_page_size_overrided(self):
         self.mock_method.return_value.execute.return_value = self.empty_items_response
-        gapi.get_all_pages(
-            self.mock_service, self.mock_method_name, pageSize=123456)
+        gapi.get_all_pages(self.mock_service,
+                           self.mock_method_name,
+                           pageSize=123456)
         request_method_kwargs = self.mock_method.call_args[1]
         self.assertIn('pageSize', request_method_kwargs)
         self.assertEqual(123456, request_method_kwargs['pageSize'])

     def test_get_all_pages_prints_paging_message(self):
         self.mock_method.return_value.execute.side_effect = self.simple_3_page_response
         paging_message = 'A simple string displayed during paging'
         with patch.object(gapi.sys.stderr, 'write') as mock_write:
-            gapi.get_all_pages(
-                self.mock_service, self.mock_method_name, page_message=paging_message)
+            gapi.get_all_pages(self.mock_service,
+                               self.mock_method_name,
+                               page_message=paging_message)
         messages_written = [
             call_args[0][0] for call_args in mock_write.call_args_list
         ]
         self.assertIn(paging_message, messages_written)

     def test_get_all_pages_prints_paging_message_inline(self):
         self.mock_method.return_value.execute.side_effect = self.simple_3_page_response
         paging_message = 'A simple string displayed during paging'
         with patch.object(gapi.sys.stderr, 'write') as mock_write:
-            gapi.get_all_pages(
-                self.mock_service, self.mock_method_name, page_message=paging_message)
+            gapi.get_all_pages(self.mock_service,
+                               self.mock_method_name,
+                               page_message=paging_message)
         messages_written = [
             call_args[0][0] for call_args in mock_write.call_args_list
         ]
         # Make sure a return carriage was written between two pages
         paging_message_call_positions = [
             i for i, message in enumerate(messages_written)
             if message == paging_message
         ]
         self.assertGreater(len(paging_message_call_positions), 1)
         printed_between_page_messages = messages_written[
             paging_message_call_positions[0]:paging_message_call_positions[1]]
         self.assertIn('\r', printed_between_page_messages)

     def test_get_all_pages_ends_paging_message_with_newline(self):
         self.mock_method.return_value.execute.side_effect = self.simple_3_page_response
         paging_message = 'A simple string displayed during paging'
         with patch.object(gapi.sys.stderr, 'write') as mock_write:
-            gapi.get_all_pages(
-                self.mock_service, self.mock_method_name, page_message=paging_message)
+            gapi.get_all_pages(self.mock_service,
+                               self.mock_method_name,
+                               page_message=paging_message)
         messages_written = [
             call_args[0][0] for call_args in mock_write.call_args_list
         ]
         last_page_message_index = len(
             messages_written) - messages_written[::-1].index(paging_message)
         last_carriage_return_index = len(
             messages_written) - messages_written[::-1].index('\r\n')
         self.assertGreater(last_carriage_return_index, last_page_message_index)

     def test_get_all_pages_prints_attribute_total_items_in_paging_message(self):
         self.mock_method.return_value.execute.side_effect = self.simple_3_page_response
         paging_message = 'Total number of items discovered: %%total_items%%'
         with patch.object(gapi.sys.stderr, 'write') as mock_write:
-            gapi.get_all_pages(
-                self.mock_service, self.mock_method_name, page_message=paging_message)
+            gapi.get_all_pages(self.mock_service,
+                               self.mock_method_name,
+                               page_message=paging_message)
         messages_written = [
             call_args[0][0] for call_args in mock_write.call_args_list
         ]
         page_1_item_count = len(self.simple_3_page_response[0]['items'])
         page_1_message = paging_message.replace('%%total_items%%',
                                                 str(page_1_item_count))
         self.assertIn(page_1_message, messages_written)
         page_2_item_count = len(self.simple_3_page_response[1]['items'])
         page_2_message = paging_message.replace(
             '%%total_items%%', str(page_1_item_count + page_2_item_count))
         self.assertIn(page_2_message, messages_written)
         page_3_item_count = len(self.simple_3_page_response[2]['items'])
         page_3_message = paging_message.replace(
             '%%total_items%%',
             str(page_1_item_count + page_2_item_count + page_3_item_count))
         self.assertIn(page_3_message, messages_written)
         # Assert that the template text is always replaced.
         for message in messages_written:
             self.assertNotIn('%%total_items', message)

     def test_get_all_pages_prints_attribute_first_item_in_paging_message(self):
         self.mock_method.return_value.execute.side_effect = self.simple_3_page_response
         paging_message = 'First item in page: %%first_item%%'
         with patch.object(gapi.sys.stderr, 'write') as mock_write:
-            gapi.get_all_pages(
-                self.mock_service,
-                self.mock_method_name,
-                page_message=paging_message,
-                message_attribute='position')
+            gapi.get_all_pages(self.mock_service,
+                               self.mock_method_name,
+                               page_message=paging_message,
+                               message_attribute='position')
         messages_written = [
             call_args[0][0] for call_args in mock_write.call_args_list
         ]
         page_1_message = paging_message.replace(
             '%%first_item%%',
             self.simple_3_page_response[0]['items'][0]['position'])
         self.assertIn(page_1_message, messages_written)
         page_2_message = paging_message.replace(
             '%%first_item%%',
             self.simple_3_page_response[1]['items'][0]['position'])
         self.assertIn(page_2_message, messages_written)
         # Assert that the template text is always replaced.
         for message in messages_written:
             self.assertNotIn('%%first_item', message)

     def test_get_all_pages_prints_attribute_last_item_in_paging_message(self):
         self.mock_method.return_value.execute.side_effect = self.simple_3_page_response
         paging_message = 'Last item in page: %%last_item%%'
         with patch.object(gapi.sys.stderr, 'write') as mock_write:
-            gapi.get_all_pages(
-                self.mock_service,
-                self.mock_method_name,
-                page_message=paging_message,
-                message_attribute='position')
+            gapi.get_all_pages(self.mock_service,
+                               self.mock_method_name,
+                               page_message=paging_message,
+                               message_attribute='position')
         messages_written = [
             call_args[0][0] for call_args in mock_write.call_args_list
         ]
         page_1_message = paging_message.replace(
             '%%last_item%%',
             self.simple_3_page_response[0]['items'][-1]['position'])
         self.assertIn(page_1_message, messages_written)
         page_2_message = paging_message.replace(
             '%%last_item%%',
             self.simple_3_page_response[1]['items'][-1]['position'])
         self.assertIn(page_2_message, messages_written)
         # Assert that the template text is always replaced.
         for message in messages_written:
             self.assertNotIn('%%last_item', message)

     def test_get_all_pages_prints_all_attributes_in_paging_message(self):
         pass

     def test_get_all_pages_passes_additional_kwargs_to_service_method(self):
         self.mock_method.return_value.execute.return_value = self.empty_items_response
-        gapi.get_all_pages(
-            self.mock_service, self.mock_method_name, my_param_1=1, my_param_2=2)
+        gapi.get_all_pages(self.mock_service,
+                           self.mock_method_name,
+                           my_param_1=1,
+                           my_param_2=2)
         method_kwargs = self.mock_method.call_args[1]
         self.assertEqual(method_kwargs.get('my_param_1'), 1)
         self.assertEqual(method_kwargs.get('my_param_2'), 2)

     @patch.object(gapi, 'call')
     def test_get_all_pages_passes_throw_and_retry_reasons(self, mock_call):
         throw_for = MagicMock()
         retry_for = MagicMock()
         mock_call.return_value = self.empty_items_response
-        gapi.get_all_pages(
-            self.mock_service,
-            self.mock_method_name,
-            throw_reasons=throw_for,
-            retry_reasons=retry_for)
+        gapi.get_all_pages(self.mock_service,
+                           self.mock_method_name,
+                           throw_reasons=throw_for,
+                           retry_reasons=retry_for)
         method_kwargs = mock_call.call_args[1]
         self.assertEqual(method_kwargs.get('throw_reasons'), throw_for)
         self.assertEqual(method_kwargs.get('retry_reasons'), retry_for)

     def test_get_all_pages_non_default_items_field_name(self):
         field_name = 'things'
         fake_response = {field_name: [{}, {}, {}]}
         self.mock_method.return_value.execute.return_value = fake_response
-        page = gapi.get_all_pages(
-            self.mock_service, self.mock_method_name, items=field_name)
+        page = gapi.get_all_pages(self.mock_service,
+                                  self.mock_method_name,
+                                  items=field_name)
         self.assertEqual(page, fake_response[field_name])

 if __name__ == '__main__':
     unittest.main()
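The tests above pin down the paging contract of `gapi.get_all_pages`: follow `nextPageToken` across pages, accumulate the configurable items field, and substitute running totals into an optional page message. A minimal standalone sketch of that contract (a simplified approximation for illustration only; the real helper also handles retries, per-API page-size caps, and stderr formatting):

```python
def get_all_pages(method, items='items', page_message=None, **kwargs):
    """Collect the `items` field from every page of a paged API method.

    `method(**kwargs)` must return an object whose `execute()` yields a
    dict; paging continues while that dict carries a 'nextPageToken'.
    """
    all_items = []
    page_token = None
    while True:
        if page_token:
            # Forward the previous page's token, as the tests require.
            kwargs['pageToken'] = page_token
        page = method(**kwargs).execute()
        all_items.extend(page.get(items, []))
        if page_message:
            # %%total_items%% is replaced with the running item count.
            print(page_message.replace('%%total_items%%', str(len(all_items))))
        page_token = page.get('nextPageToken')
        if not page_token:
            return all_items


class _FakeRequest:
    """Stands in for a googleapiclient request object (hypothetical stub)."""

    def __init__(self, page):
        self._page = page

    def execute(self):
        return self._page


# Two fake pages: the first carries a nextPageToken, the second does not.
_pages = [{'items': [1, 2], 'nextPageToken': 't'}, {'items': [3]}]


def _fake_method(**kwargs):
    return _FakeRequest(_pages.pop(0))


result = get_all_pages(_fake_method, page_message='Got %%total_items%% items')
print(result)  # [1, 2, 3]
```

`_FakeRequest` and `_fake_method` mirror what the `mock_method` fixtures in the tests simulate; they are illustrative stand-ins, not part of GAM.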

View File

@@ -19,13 +19,12 @@ def normalizeCalendarId(calname, checkPrimary=False):
     if not GC_Values[GC_DOMAIN]:
         GC_Values[GC_DOMAIN] = gam._getValueFromOAuth('hd')
     return gam.convertUIDtoEmailAddress(calname,
                                         email_types=['user', 'resource'])


 def buildCalendarGAPIObject(calname):
     calendarId = normalizeCalendarId(calname)
-    return (calendarId, gam.buildGAPIServiceObject('calendar',
-                                                   calendarId))
+    return (calendarId, gam.buildGAPIServiceObject('calendar', calendarId))


 def buildCalendarDataGAPIObject(calname):
@@ -41,6 +40,7 @@ def buildCalendarDataGAPIObject(calname):
     _, cal = buildCalendarGAPIObject(gam._getValueFromOAuth('email'))
     return (calendarId, cal)

+
 def printShowACLs(csvFormat):
     calendarId, cal = buildCalendarDataGAPIObject(sys.argv[2])
     if not cal:
@@ -54,10 +54,9 @@ def printShowACLs(csvFormat):
             i += 1
         else:
             action = ['showacl', 'printacl'][csvFormat]
-            message = f"gam calendar <email> {action}"
+            message = f'gam calendar <email> {action}'
             controlflow.invalid_argument_exit(sys.argv[i], message)
-    acls = gapi.get_all_pages(
-        cal.acl(), 'list', 'items', calendarId=calendarId)
+    acls = gapi.get_all_pages(cal.acl(), 'list', 'items', calendarId=calendarId)
     i = 0
     if csvFormat:
         titles = []
@@ -75,10 +74,11 @@ def printShowACLs(csvFormat):
         else:
             formatted_acl = formatACLRule(rule)
             current_count = display.current_count(i, count)
-            print(f'Calendar: {calendarId}, ACL: {formatted_acl}{current_count}')
+            print(
+                f'Calendar: {calendarId}, ACL: {formatted_acl}{current_count}')
     if csvFormat:
-        display.write_csv_file(
-            rows, titles, f'{calendarId} Calendar ACLs', toDrive)
+        display.write_csv_file(rows, titles, f'{calendarId} Calendar ACLs',
+                               toDrive)


 def _getCalendarACLScope(i, body):
@@ -87,8 +87,8 @@ def _getCalendarACLScope(i, body):
     body['scope']['type'] = myarg
     i += 1
     if myarg in ['user', 'group']:
-        body['scope']['value'] = gam.normalizeEmailAddressOrUID(
-            sys.argv[i], noUid=True)
+        body['scope']['value'] = gam.normalizeEmailAddressOrUID(sys.argv[i],
+                                                                noUid=True)
         i += 1
     elif myarg == 'domain':
         if i < len(sys.argv) and \
@@ -99,8 +99,8 @@ def _getCalendarACLScope(i, body):
             body['scope']['value'] = GC_Values[GC_DOMAIN]
     elif myarg != 'default':
         body['scope']['type'] = 'user'
-        body['scope']['value'] = gam.normalizeEmailAddressOrUID(
-            myarg, noUid=True)
+        body['scope']['value'] = gam.normalizeEmailAddressOrUID(myarg,
+                                                                noUid=True)
     return i

@@ -122,22 +122,26 @@ def addACL(function):
         return
     myarg = sys.argv[4].lower().replace('_', '')
     if myarg not in CALENDAR_ACL_ROLES_MAP:
-        controlflow.expected_argument_exit(
-            "Role", ", ".join(CALENDAR_ACL_ROLES_MAP), myarg)
+        controlflow.expected_argument_exit('Role',
+                                           ', '.join(CALENDAR_ACL_ROLES_MAP),
+                                           myarg)
     body = {'role': CALENDAR_ACL_ROLES_MAP[myarg]}
     i = _getCalendarACLScope(5, body)
     sendNotifications = True
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'sendnotifications':
-            sendNotifications = gam.getBoolean(sys.argv[i+1], myarg)
+            sendNotifications = gam.getBoolean(sys.argv[i + 1], myarg)
             i += 2
         else:
-            controlflow.invalid_argument_exit(
-                sys.argv[i], f"gam calendar <email> {function.lower()}")
+            controlflow.invalid_argument_exit(
+                sys.argv[i], f'gam calendar <email> {function.lower()}')
     print(f'Calendar: {calendarId}, {function} ACL: {formatACLRule(body)}')
-    gapi.call(cal.acl(), 'insert', calendarId=calendarId,
-              body=body, sendNotifications=sendNotifications)
+    gapi.call(cal.acl(),
+              'insert',
+              calendarId=calendarId,
+              body=body,
+              sendNotifications=sendNotifications)


 def delACL():
@@ -152,8 +156,11 @@ def delACL():
     body = {'role': 'none'}
     _getCalendarACLScope(5, body)
     print(f'Calendar: {calendarId}, Delete ACL: {formatACLScope(body)}')
-    gapi.call(cal.acl(), 'insert', calendarId=calendarId,
-              body=body, sendNotifications=False)
+    gapi.call(cal.acl(),
+              'insert',
+              calendarId=calendarId,
+              body=body,
+              sendNotifications=False)


 def wipeData():
@@ -176,7 +183,7 @@ def printEvents():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'query':
-            q = sys.argv[i+1]
+            q = sys.argv[i + 1]
             i += 2
         elif myarg == 'includedeleted':
             showDeleted = True
@@ -185,30 +192,34 @@ def printEvents():
             showHiddenInvitations = True
             i += 1
         elif myarg == 'after':
-            timeMin = utils.get_time_or_delta_from_now(sys.argv[i+1])
+            timeMin = utils.get_time_or_delta_from_now(sys.argv[i + 1])
             i += 2
         elif myarg == 'before':
-            timeMax = utils.get_time_or_delta_from_now(sys.argv[i+1])
+            timeMax = utils.get_time_or_delta_from_now(sys.argv[i + 1])
             i += 2
         elif myarg == 'timezone':
-            timeZone = sys.argv[i+1]
+            timeZone = sys.argv[i + 1]
             i += 2
         elif myarg == 'updated':
-            updatedMin = utils.get_time_or_delta_from_now(sys.argv[i+1])
+            updatedMin = utils.get_time_or_delta_from_now(sys.argv[i + 1])
             i += 2
         elif myarg == 'todrive':
             toDrive = True
             i += 1
         else:
             controlflow.invalid_argument_exit(
-                sys.argv[i], "gam calendar <email> printevents")
+                sys.argv[i], 'gam calendar <email> printevents')
     page_message = gapi.got_total_items_msg(f'Events for {calendarId}', '')
-    results = gapi.get_all_pages(cal.events(), 'list', 'items',
-                                 page_message=page_message,
-                                 calendarId=calendarId, q=q,
-                                 showDeleted=showDeleted,
-                                 showHiddenInvitations=showHiddenInvitations,
-                                 timeMin=timeMin, timeMax=timeMax,
-                                 timeZone=timeZone,
-                                 updatedMin=updatedMin)
+    results = gapi.get_all_pages(cal.events(),
+                                 'list',
+                                 'items',
+                                 page_message=page_message,
+                                 calendarId=calendarId,
+                                 q=q,
+                                 showDeleted=showDeleted,
+                                 showHiddenInvitations=showHiddenInvitations,
+                                 timeMin=timeMin,
+                                 timeMax=timeMax,
+                                 timeZone=timeZone,
+                                 updatedMin=updatedMin)
     for result in results:
@@ -237,17 +248,19 @@ def getSendUpdates(myarg, i, cal):
         sendUpdates = 'all'
         i += 1
     elif myarg == 'sendnotifications':
-        sendUpdates = 'all' if gam.getBoolean(sys.argv[i+1], myarg) else 'none'
+        sendUpdates = 'all' if gam.getBoolean(sys.argv[i +
+                                                       1], myarg) else 'none'
         i += 2
     else:  # 'sendupdates':
         sendUpdatesMap = {}
         for val in cal._rootDesc['resources']['events']['methods']['delete'][
                 'parameters']['sendUpdates']['enum']:
             sendUpdatesMap[val.lower()] = val
-        sendUpdates = sendUpdatesMap.get(sys.argv[i+1].lower(), False)
+        sendUpdates = sendUpdatesMap.get(sys.argv[i + 1].lower(), False)
         if not sendUpdates:
-            controlflow.expected_argument_exit(
-                "sendupdates", ", ".join(sendUpdatesMap), sys.argv[i+1])
+            controlflow.expected_argument_exit('sendupdates',
+                                               ', '.join(sendUpdatesMap),
+                                               sys.argv[i + 1])
         i += 2
     return (sendUpdates, i)
@@ -265,7 +278,7 @@ def moveOrDeleteEvent(moveOrDelete):
         if myarg in ['notifyattendees', 'sendnotifications', 'sendupdates']:
             sendUpdates, i = getSendUpdates(myarg, i, cal)
         elif myarg in ['id', 'eventid']:
-            eventId = sys.argv[i+1]
+            eventId = sys.argv[i + 1]
             i += 2
         elif myarg in ['query', 'eventquery']:
             controlflow.system_error_exit(
@@ -276,15 +289,19 @@ def moveOrDeleteEvent(moveOrDelete):
             doit = True
             i += 1
         elif moveOrDelete == 'move' and myarg == 'destination':
-            kwargs['destination'] = sys.argv[i+1]
+            kwargs['destination'] = sys.argv[i + 1]
             i += 2
         else:
             controlflow.invalid_argument_exit(
-                sys.argv[i], f"gam calendar <email> {moveOrDelete}event")
+                sys.argv[i], f'gam calendar <email> {moveOrDelete}event')
     if doit:
         print(f' going to {moveOrDelete} eventId {eventId}')
-        gapi.call(cal.events(), moveOrDelete, calendarId=calendarId,
-                  eventId=eventId, sendUpdates=sendUpdates, **kwargs)
+        gapi.call(cal.events(),
+                  moveOrDelete,
+                  calendarId=calendarId,
+                  eventId=eventId,
+                  sendUpdates=sendUpdates,
+                  **kwargs)
     else:
         print(
             f' would {moveOrDelete} eventId {eventId}. Add doit to command ' \
@@ -296,8 +313,10 @@ def infoEvent():
     if not cal:
         return
     eventId = sys.argv[4]
-    result = gapi.call(cal.events(), 'get',
-                       calendarId=calendarId, eventId=eventId)
+    result = gapi.call(cal.events(),
+                       'get',
+                       calendarId=calendarId,
+                       eventId=eventId)
     display.print_json(result)

@@ -316,25 +335,36 @@ def addOrUpdateEvent(action):
         kwargs = {'eventId': eventId}
         i = 5
         func = 'patch'
-        requires_full_update = ['attendee', 'optionalattendee',
-                                'removeattendee', 'replacedescription']
+        requires_full_update = [
+            'attendee', 'optionalattendee', 'removeattendee',
+            'replacedescription'
+        ]
         for arg in sys.argv[i:]:
             if arg.replace('_', '').lower() in requires_full_update:
                 func = 'update'
-                body = gapi.call(cal.events(), 'get',
-                                 calendarId=calendarId, eventId=eventId)
+                body = gapi.call(cal.events(),
+                                 'get',
+                                 calendarId=calendarId,
+                                 eventId=eventId)
                 break
     sendUpdates, body = getEventAttributes(i, calendarId, cal, body, action)
-    result = gapi.call(cal.events(), func, conferenceDataVersion=1,
-                       supportsAttachments=True, calendarId=calendarId,
-                       sendUpdates=sendUpdates, body=body, fields='id',
+    result = gapi.call(cal.events(),
+                       func,
+                       conferenceDataVersion=1,
+                       supportsAttachments=True,
+                       calendarId=calendarId,
+                       sendUpdates=sendUpdates,
+                       body=body,
+                       fields='id',
                        **kwargs)
     print(f'Event {result["id"]} {action} finished')


 def _remove_attendee(attendees, remove_email):
-    return [attendee for attendee in attendees
-            if not attendee['email'].lower() == remove_email]
+    return [
+        attendee for attendee in attendees
+        if not attendee['email'].lower() == remove_email
+    ]


 def getEventAttributes(i, calendarId, cal, body, action):
@@ -348,45 +378,48 @@ def getEventAttributes(i, calendarId, cal, body, action):
             sendUpdates, i = getSendUpdates(myarg, i, cal)
         elif myarg == 'attendee':
             body.setdefault('attendees', [])
-            body['attendees'].append({'email': sys.argv[i+1]})
+            body['attendees'].append({'email': sys.argv[i + 1]})
             i += 2
         elif myarg == 'removeattendee' and action == 'update':
-            remove_email = sys.argv[i+1].lower()
+            remove_email = sys.argv[i + 1].lower()
             if 'attendees' in body:
                 body['attendees'] = _remove_attendee(body['attendees'],
                                                      remove_email)
             i += 2
         elif myarg == 'optionalattendee':
             body.setdefault('attendees', [])
-            body['attendees'].append(
-                {'email': sys.argv[i+1], 'optional': True})
+            body['attendees'].append({
+                'email': sys.argv[i + 1],
+                'optional': True
+            })
             i += 2
         elif myarg == 'anyonecanaddself':
             body['anyoneCanAddSelf'] = True
             i += 1
         elif myarg == 'description':
-            body['description'] = sys.argv[i+1].replace('\\n', '\n')
+            body['description'] = sys.argv[i + 1].replace('\\n', '\n')
             i += 2
         elif myarg == 'replacedescription' and action == 'update':
-            search = sys.argv[i+1]
-            replace = sys.argv[i+2]
+            search = sys.argv[i + 1]
+            replace = sys.argv[i + 2]
             if 'description' in body:
-                body['description'] = re.sub(search, replace, body['description'])
+                body['description'] = re.sub(search, replace,
+                                             body['description'])
             i += 3
         elif myarg == 'start':
-            if sys.argv[i+1].lower() == 'allday':
-                body['start'] = {'date': utils.get_yyyymmdd(sys.argv[i+2])}
+            if sys.argv[i + 1].lower() == 'allday':
+                body['start'] = {'date': utils.get_yyyymmdd(sys.argv[i + 2])}
                 i += 3
             else:
-                start_time = utils.get_time_or_delta_from_now(sys.argv[i+1])
+                start_time = utils.get_time_or_delta_from_now(sys.argv[i + 1])
                 body['start'] = {'dateTime': start_time}
                 i += 2
         elif myarg == 'end':
-            if sys.argv[i+1].lower() == 'allday':
-                body['end'] = {'date': utils.get_yyyymmdd(sys.argv[i+2])}
+            if sys.argv[i + 1].lower() == 'allday':
+                body['end'] = {'date': utils.get_yyyymmdd(sys.argv[i + 2])}
                 i += 3
             else:
-                end_time = utils.get_time_or_delta_from_now(sys.argv[i+1])
+                end_time = utils.get_time_or_delta_from_now(sys.argv[i + 1])
                 body['end'] = {'dateTime': end_time}
                 i += 2
         elif myarg == 'guestscantinviteothers':
@@ -394,64 +427,66 @@ def getEventAttributes(i, calendarId, cal, body, action):
            i += 1
        elif myarg == 'guestscaninviteothers':
            body['guestsCanInviteTohters'] = gam.getBoolean(
                sys.argv[i + 1], 'guestscaninviteothers')
            i += 2
        elif myarg == 'guestscantseeothers':
            body['guestsCanSeeOtherGuests'] = False
            i += 1
        elif myarg == 'guestscanseeothers':
            body['guestsCanSeeOtherGuests'] = gam.getBoolean(
                sys.argv[i + 1], 'guestscanseeothers')
            i += 2
        elif myarg == 'guestscanmodify':
            body['guestsCanModify'] = gam.getBoolean(sys.argv[i + 1],
                                                     'guestscanmodify')
            i += 2
        elif myarg == 'id':
            if action == 'update':
                controlflow.invalid_argument_exit(
                    'id', 'gam calendar <calendar> updateevent')
            body['id'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'summary':
            body['summary'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'location':
            body['location'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'available':
            body['transparency'] = 'transparent'
            i += 1
        elif myarg == 'transparency':
            validTransparency = ['opaque', 'transparent']
            if sys.argv[i + 1].lower() in validTransparency:
                body['transparency'] = sys.argv[i + 1].lower()
            else:
                controlflow.expected_argument_exit('transparency',
                                                   ', '.join(validTransparency),
                                                   sys.argv[i + 1])
            i += 2
        elif myarg == 'visibility':
            validVisibility = ['default', 'public', 'private']
            if sys.argv[i + 1].lower() in validVisibility:
                body['visibility'] = sys.argv[i + 1].lower()
            else:
                controlflow.expected_argument_exit('visibility',
                                                   ', '.join(validVisibility),
                                                   sys.argv[i + 1])
            i += 2
        elif myarg == 'tentative':
            body['status'] = 'tentative'
            i += 1
        elif myarg == 'status':
            validStatus = ['confirmed', 'tentative', 'cancelled']
            if sys.argv[i + 1].lower() in validStatus:
                body['status'] = sys.argv[i + 1].lower()
            else:
                controlflow.expected_argument_exit('visibility',
                                                   ', '.join(validStatus),
                                                   sys.argv[i + 1])
            i += 2
        elif myarg == 'source':
            body['source'] = {'title': sys.argv[i + 1], 'url': sys.argv[i + 2]}
            i += 3
        elif myarg == 'noreminders':
            body['reminders'] = {'useDefault': False}
@@ -460,43 +495,48 @@ def getEventAttributes(i, calendarId, cal, body, action):
            minutes = \
                gam.getInteger(sys.argv[i+1], myarg, minVal=0,
                               maxVal=CALENDAR_REMINDER_MAX_MINUTES)
            reminder = {'minutes': minutes, 'method': sys.argv[i + 2]}
            body.setdefault('reminders', {'overrides': [], 'useDefault': False})
            body['reminders']['overrides'].append(reminder)
            i += 3
        elif myarg == 'recurrence':
            body.setdefault('recurrence', [])
            body['recurrence'].append(sys.argv[i + 1])
            i += 2
        elif myarg == 'timezone':
            timeZone = sys.argv[i + 1]
            i += 2
        elif myarg == 'privateproperty':
            if 'extendedProperties' not in body:
                body['extendedProperties'] = {'private': {}, 'shared': {}}
            body['extendedProperties']['private'][sys.argv[i +
                                                           1]] = sys.argv[i + 2]
            i += 3
        elif myarg == 'sharedproperty':
            if 'extendedProperties' not in body:
                body['extendedProperties'] = {'private': {}, 'shared': {}}
            body['extendedProperties']['shared'][sys.argv[i + 1]] = sys.argv[i +
                                                                             2]
            i += 3
        elif myarg == 'colorindex':
            body['colorId'] = gam.getInteger(sys.argv[i + 1], myarg,
                                             CALENDAR_EVENT_MIN_COLOR_INDEX,
                                             CALENDAR_EVENT_MAX_COLOR_INDEX)
            i += 2
        elif myarg == 'hangoutsmeet':
            body['conferenceData'] = {
                'createRequest': {
                    'requestId': f'{str(uuid.uuid4())}'
                }
            }
            i += 1
        else:
            controlflow.invalid_argument_exit(
                sys.argv[i], f'gam calendar <email> {action}event')
    if ('recurrence' in body) and (('start' in body) or ('end' in body)):
        if not timeZone:
            timeZone = gapi.call(cal.calendars(),
                                 'get',
                                 calendarId=calendarId,
                                 fields='timeZone')['timeZone']
        if 'start' in body:
@@ -515,20 +555,20 @@ def modifySettings():
    while i < len(sys.argv):
        myarg = sys.argv[i].lower().replace('_', '')
        if myarg == 'description':
            body['description'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'location':
            body['location'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'summary':
            body['summary'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'timezone':
            body['timeZone'] = sys.argv[i + 1]
            i += 2
        else:
            controlflow.invalid_argument_exit(sys.argv[i],
                                              'gam calendar <email> modify')
    gapi.call(cal.calendars(), 'patch', calendarId=calendarId, body=body)
@@ -540,23 +580,23 @@ def changeAttendees(users):
    while len(sys.argv) > i:
        myarg = sys.argv[i].lower()
        if myarg == 'csv':
            csv_file = sys.argv[i + 1]
            i += 2
        elif myarg == 'dryrun':
            do_it = False
            i += 1
        elif myarg == 'start':
            start_date = utils.get_time_or_delta_from_now(sys.argv[i + 1])
            i += 2
        elif myarg == 'end':
            end_date = utils.get_time_or_delta_from_now(sys.argv[i + 1])
            i += 2
        elif myarg == 'allevents':
            allevents = True
            i += 1
        else:
            controlflow.invalid_argument_exit(
                sys.argv[i], 'gam <users> update calattendees')
    attendee_map = {}
    f = fileutils.open_file(csv_file)
    csvFile = csv.reader(f)
@@ -570,9 +610,13 @@ def changeAttendees(users):
            continue
        page_token = None
        while True:
            events_page = gapi.call(cal.events(),
                                    'list',
                                    calendarId=user,
                                    pageToken=page_token,
                                    timeMin=start_date,
                                    timeMax=end_date,
                                    showDeleted=False,
                                    showHiddenInvitations=False)
            print(f'Got {len(events_page.get("items", []))}')
            for event in events_page.get('items', []):
@@ -596,8 +640,8 @@ def changeAttendees(users):
                    try:
                        if attendee['email'].lower() in attendee_map:
                            old_email = attendee['email'].lower()
                            new_email = attendee_map[
                                attendee['email'].lower()]
                            print(f' SWITCHING attendee {old_email} to ' \
                                  f'{new_email} for {event_summary}')
                            event['attendees'].remove(attendee)
@@ -612,9 +656,12 @@ def changeAttendees(users):
                    body['attendees'] = event['attendees']
                    print(f'UPDATING {event_summary}')
                    if do_it:
                        gapi.call(cal.events(),
                                  'patch',
                                  calendarId=user,
                                  eventId=event['id'],
                                  sendNotifications=False,
                                  body=body)
                    else:
                        print(' not pulling the trigger.')
                # else:
@@ -631,8 +678,10 @@ def deleteCalendar(users):
        user, cal = buildCalendarGAPIObject(user)
        if not cal:
            continue
        gapi.call(cal.calendarList(),
                  'delete',
                  soft_errors=True,
                  calendarId=calendarId)


CALENDAR_REMINDER_MAX_MINUTES = 40320
@@ -649,62 +698,71 @@ def getCalendarAttributes(i, body, function):
    while i < len(sys.argv):
        myarg = sys.argv[i].lower().replace('_', '')
        if myarg == 'selected':
            body['selected'] = gam.getBoolean(sys.argv[i + 1], myarg)
            i += 2
        elif myarg == 'hidden':
            body['hidden'] = gam.getBoolean(sys.argv[i + 1], myarg)
            i += 2
        elif myarg == 'summary':
            body['summaryOverride'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'colorindex':
            body['colorId'] = gam.getInteger(sys.argv[i + 1],
                                             myarg,
                                             minVal=CALENDAR_MIN_COLOR_INDEX,
                                             maxVal=CALENDAR_MAX_COLOR_INDEX)
            i += 2
        elif myarg == 'backgroundcolor':
            body['backgroundColor'] = gam.getColor(sys.argv[i + 1])
            colorRgbFormat = True
            i += 2
        elif myarg == 'foregroundcolor':
            body['foregroundColor'] = gam.getColor(sys.argv[i + 1])
            colorRgbFormat = True
            i += 2
        elif myarg == 'reminder':
            body.setdefault('defaultReminders', [])
            method = sys.argv[i + 1].lower()
            if method not in CLEAR_NONE_ARGUMENT:
                if method not in CALENDAR_REMINDER_METHODS:
                    controlflow.expected_argument_exit(
                        'Method', ', '.join(CALENDAR_REMINDER_METHODS +
                                            CLEAR_NONE_ARGUMENT), method)
                minutes = gam.getInteger(sys.argv[i + 2],
                                         myarg,
                                         minVal=0,
                                         maxVal=CALENDAR_REMINDER_MAX_MINUTES)
                body['defaultReminders'].append({
                    'method': method,
                    'minutes': minutes
                })
                i += 3
            else:
                i += 2
        elif myarg == 'notification':
            body.setdefault('notificationSettings', {'notifications': []})
            method = sys.argv[i + 1].lower()
            if method not in CLEAR_NONE_ARGUMENT:
                if method not in CALENDAR_NOTIFICATION_METHODS:
                    controlflow.expected_argument_exit(
                        'Method', ', '.join(CALENDAR_NOTIFICATION_METHODS +
                                            CLEAR_NONE_ARGUMENT), method)
                eventType = sys.argv[i + 2].lower()
                if eventType not in CALENDAR_NOTIFICATION_TYPES_MAP:
                    controlflow.expected_argument_exit(
                        'Event', ', '.join(CALENDAR_NOTIFICATION_TYPES_MAP),
                        eventType)
                notice = {
                    'method': method,
                    'type': CALENDAR_NOTIFICATION_TYPES_MAP[eventType]
                }
                body['notificationSettings']['notifications'].append(notice)
                i += 3
            else:
                i += 2
        else:
            controlflow.invalid_argument_exit(sys.argv[i],
                                              f'gam {function} calendar')
    return colorRgbFormat
@ -721,8 +779,11 @@ def addCalendar(users):
continue continue
current_count = display.current_count(i, count) current_count = display.current_count(i, count)
print(f'Subscribing {user} to calendar {calendarId}{current_count}') print(f'Subscribing {user} to calendar {calendarId}{current_count}')
gapi.call(cal.calendarList(), 'insert', soft_errors=True, gapi.call(cal.calendarList(),
body=body, colorRgbFormat=colorRgbFormat) 'insert',
soft_errors=True,
body=body,
colorRgbFormat=colorRgbFormat)
def updateCalendar(users): def updateCalendar(users):
@@ -740,13 +801,17 @@ def updateCalendar(users):
        print(f"Updating {user}'s subscription to calendar ' \
              f'{calendarId}{current_count}")
        calId = calendarId if calendarId != 'primary' else user
        gapi.call(cal.calendarList(),
                  'patch',
                  soft_errors=True,
                  calendarId=calId,
                  body=body,
                  colorRgbFormat=colorRgbFormat)


def _showCalendar(userCalendar, j, jcount):
    current_count = display.current_count(j, jcount)
    summary = userCalendar.get('summaryOverride', userCalendar['summary'])
    print(f'  Calendar: {userCalendar["id"]}{current_count}')
    print(f'    Summary: {summary}')
    print(f'    Description: {userCalendar.get("description", "")}')
@@ -780,7 +845,8 @@ def infoCalendar(users):
        user, cal = buildCalendarGAPIObject(user)
        if not cal:
            continue
        result = gapi.call(cal.calendarList(),
                           'get',
                           soft_errors=True,
                           calendarId=calendarId)
        if result:
@@ -809,8 +875,10 @@ def printShowCalendars(users, csvFormat):
        user, cal = buildCalendarGAPIObject(user)
        if not cal:
            continue
        result = gapi.get_all_pages(cal.calendarList(),
                                    'list',
                                    'items',
                                    soft_errors=True)
        jcount = len(result)
        if not csvFormat:
            print(f'User: {user}, Calendars:{display.current_count(i, count)}')
@@ -825,8 +893,9 @@ def printShowCalendars(users, csvFormat):
            continue
        for userCalendar in result:
            row = {'primaryEmail': user}
            display.add_row_titles_to_csv_file(
                utils.flatten_json(userCalendar, flattened=row), csvRows,
                titles)
    if csvFormat:
        display.sort_csv_titles(['primaryEmail', 'id'], titles)
        display.write_csv_file(csvRows, titles, 'Calendars', todrive)
@@ -840,8 +909,10 @@ def showCalSettings(users):
        user, cal = buildCalendarGAPIObject(user)
        if not cal:
            continue
        feed = gapi.get_all_pages(cal.settings(),
                                  'list',
                                  'items',
                                  soft_errors=True)
        if feed:
            current_count = display.current_count(i, count)
            print(f'User: {user}, Calendar Settings:{current_count}')
@@ -862,11 +933,11 @@ def transferSecCals(users):
            remove_source_user = False
            i += 1
        elif myarg == 'sendnotifications':
            sendNotifications = gam.getBoolean(sys.argv[i + 1], myarg)
            i += 2
        else:
            controlflow.invalid_argument_exit(sys.argv[i],
                                              'gam <users> transfer seccals')
    if remove_source_user:
        target_user, target_cal = buildCalendarGAPIObject(target_user)
        if not target_cal:
@@ -875,20 +946,38 @@ def transferSecCals(users):
        user, source_cal = buildCalendarGAPIObject(user)
        if not source_cal:
            continue
        calendars = gapi.get_all_pages(source_cal.calendarList(),
                                       'list',
                                       'items',
                                       soft_errors=True,
                                       minAccessRole='owner',
                                       showHidden=True,
                                       fields='items(id),nextPageToken')
        for calendar in calendars:
            calendarId = calendar['id']
            if calendarId.find('@group.calendar.google.com') != -1:
                body = {
                    'role': 'owner',
                    'scope': {
                        'type': 'user',
                        'value': target_user
                    }
                }
                gapi.call(source_cal.acl(),
                          'insert',
                          calendarId=calendarId,
                          body=body,
                          sendNotifications=sendNotifications)
                if remove_source_user:
                    body = {
                        'role': 'none',
                        'scope': {
                            'type': 'user',
                            'value': user
                        }
                    }
                    gapi.call(target_cal.acl(),
                              'insert',
                              calendarId=calendarId,
                              body=body,
                              sendNotifications=sendNotifications)
@@ -20,29 +20,33 @@ def doUpdateCros():
    while i < len(sys.argv):
        myarg = sys.argv[i].lower().replace('_', '')
        if myarg == 'user':
            update_body['annotatedUser'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'location':
            update_body['annotatedLocation'] = sys.argv[i + 1]
            i += 2
        elif myarg == 'notes':
            update_body['notes'] = sys.argv[i + 1].replace('\\n', '\n')
            i += 2
        elif myarg in ['tag', 'asset', 'assetid']:
            update_body['annotatedAssetId'] = sys.argv[i + 1]
            i += 2
        elif myarg in ['ou', 'org']:
            orgUnitPath = gam.getOrgUnitItem(sys.argv[i + 1])
            i += 2
        elif myarg == 'action':
            action = sys.argv[i + 1].lower().replace('_', '').replace('-', '')
            deprovisionReason = None
            if action in [
                    'deprovisionsamemodelreplace',
                    'deprovisionsamemodelreplacement'
            ]:
                action = 'deprovision'
                deprovisionReason = 'same_model_replacement'
            elif action in [
                    'deprovisiondifferentmodelreplace',
                    'deprovisiondifferentmodelreplacement'
            ]:
                action = 'deprovision'
                deprovisionReason = 'different_model_replacement'
            elif action in ['deprovisionretiringdevice']:
@@ -62,7 +66,7 @@ def doUpdateCros():
            ack_wipe = True
            i += 1
        else:
            controlflow.invalid_argument_exit(sys.argv[i], 'gam update cros')
    i = 0
    count = len(devices)
    if action_body:
@@ -86,27 +90,33 @@ def doUpdateCros():
            i += 1
            cur_count = gam.currentCount(i, count)
            print(f' performing action {action} for {deviceId}{cur_count}')
            gapi.call(cd.chromeosdevices(),
                      function='action',
                      customerId=GC_Values[GC_CUSTOMER_ID],
                      resourceId=deviceId,
                      body=action_body)
    else:
        if update_body:
            for deviceId in devices:
                i += 1
                current_count = gam.currentCount(i, count)
                print(f' updating {deviceId}{current_count}')
                gapi.call(cd.chromeosdevices(),
                          'update',
                          customerId=GC_Values[GC_CUSTOMER_ID],
                          deviceId=deviceId,
                          body=update_body)
        if orgUnitPath:
            # split moves into max 50 devices per batch
            for l in range(0, len(devices), 50):
                move_body = {'deviceIds': devices[l:l + 50]}
                print(f' moving {len(move_body["deviceIds"])} devices to ' \
                      f'{orgUnitPath}')
                gapi.call(cd.chromeosdevices(),
                          'moveDevicesToOu',
                          customerId=GC_Values[GC_CUSTOMER_ID],
                          orgUnitPath=orgUnitPath,
                          body=move_body)


def doGetCrosInfo():
@@ -125,13 +135,13 @@ def doGetCrosInfo():
            noLists = True
            i += 1
        elif myarg == 'listlimit':
            listLimit = gam.getInteger(sys.argv[i + 1], myarg, minVal=-1)
            i += 2
        elif myarg in CROS_START_ARGUMENTS:
            startDate = _getFilterDate(sys.argv[i + 1])
            i += 2
        elif myarg in CROS_END_ARGUMENTS:
            endDate = _getFilterDate(sys.argv[i + 1])
            i += 2
        elif myarg == 'allfields':
            projection = 'FULL'
@@ -148,7 +158,7 @@ def doGetCrosInfo():
            fieldsList.extend(CROS_ARGUMENT_TO_PROPERTY_MAP[myarg])
            i += 1
        elif myarg == 'fields':
            fieldNameList = sys.argv[i + 1]
            for field in fieldNameList.lower().replace(',', ' ').split():
                if field in CROS_ARGUMENT_TO_PROPERTY_MAP:
                    fieldsList.extend(CROS_ARGUMENT_TO_PROPERTY_MAP[field])
@@ -158,21 +168,21 @@ def doGetCrosInfo():
                    projection = 'FULL'
                    noLists = False
                else:
                    controlflow.invalid_argument_exit(field,
                                                      'gam info cros fields')
            i += 2
        elif myarg == 'downloadfile':
            downloadfile = sys.argv[i + 1]
            if downloadfile.lower() == 'latest':
                downloadfile = downloadfile.lower()
            i += 2
        elif myarg == 'targetfolder':
            targetFolder = os.path.expanduser(sys.argv[i + 1])
            if not os.path.isdir(targetFolder):
                os.makedirs(targetFolder)
            i += 2
        else:
            controlflow.invalid_argument_exit(sys.argv[i], 'gam info cros')
    if fieldsList:
        fieldsList.append('deviceId')
        fields = ','.join(set(fieldsList)).replace('.', '/')
@@ -182,9 +192,11 @@ def doGetCrosInfo():
    device_count = len(devices)
    for deviceId in devices:
        i += 1
        cros = gapi.call(cd.chromeosdevices(),
                         'get',
                         customerId=GC_Values[GC_CUSTOMER_ID],
                         deviceId=deviceId,
                         projection=projection,
                         fields=fields)
        print(f'CrOS Device: {deviceId} ({i} of {device_count})')
        if 'notes' in cros:
@@ -208,8 +220,8 @@ def doGetCrosInfo():
            print('  activeTimeRanges')
            num_ranges = min(lenATR, listLimit or lenATR)
            for activeTimeRange in activeTimeRanges[:num_ranges]:
                active_date = activeTimeRange['date']
                active_time = activeTimeRange['activeTime']
                duration = utils.formatMilliSeconds(active_time)
                minutes = active_time // 60000
                print(f'    date: {active_date}')
@@ -222,16 +234,17 @@ def doGetCrosInfo():
            print('  recentUsers')
            num_ranges = min(lenRU, listLimit or lenRU)
            for recentUser in recentUsers[:num_ranges]:
                useremail = recentUser.get('email')
                if not useremail:
                    if recentUser['type'] == 'USER_TYPE_UNMANAGED':
                        useremail = 'UnmanagedUser'
                    else:
                        useremail = 'Unknown'
                print(f'    type: {recentUser["type"]}')
                print(f'    email: {useremail}')
        deviceFiles = _filterCreateReportTime(cros.get('deviceFiles',
                                                       []), 'createTime',
                                              startDate, endDate)
        lenDF = len(deviceFiles)
        if lenDF:
            num_ranges = min(lenDF, listLimit or lenDF)
@ -255,22 +268,21 @@ def doGetCrosInfo():
f'available to download.') f'available to download.')
deviceFile = None deviceFile = None
if deviceFile: if deviceFile:
created = deviceFile["createTime"] created = deviceFile['createTime']
downloadfile = f'cros-logs-{deviceId}-{created}.zip' downloadfile = f'cros-logs-{deviceId}-{created}.zip'
downloadfilename = os.path.join(targetFolder, downloadfilename = os.path.join(targetFolder,
downloadfile) downloadfile)
dl_url = deviceFile['downloadUrl'] dl_url = deviceFile['downloadUrl']
_, content = cd._http.request(dl_url) _, content = cd._http.request(dl_url)
fileutils.write_file(downloadfilename, content, fileutils.write_file(downloadfilename,
content,
mode='wb', mode='wb',
continue_on_error=True) continue_on_error=True)
print(f'Downloaded: {downloadfilename}') print(f'Downloaded: {downloadfilename}')
elif downloadfile: elif downloadfile:
print('ERROR: no files to download.') print('ERROR: no files to download.')
cpuStatusReports = _filterCreateReportTime( cpuStatusReports = _filterCreateReportTime(
cros.get('cpuStatusReports', []), cros.get('cpuStatusReports', []), 'reportTime', startDate,
'reportTime',
startDate,
endDate) endDate)
lenCSR = len(cpuStatusReports) lenCSR = len(cpuStatusReports)
if lenCSR: if lenCSR:
@ -284,8 +296,8 @@ def doGetCrosInfo():
temp_label = tempInfo['label'].strip() temp_label = tempInfo['label'].strip()
temperature = tempInfo['temperature'] temperature = tempInfo['temperature']
print(f' {temp_label}: {temperature}') print(f' {temp_label}: {temperature}')
pct_info = cpuStatusReport["cpuUtilizationPercentageInfo"] pct_info = cpuStatusReport['cpuUtilizationPercentageInfo']
util = ",".join([str(x) for x in pct_info]) util = ','.join([str(x) for x in pct_info])
print(f' cpuUtilizationPercentageInfo: {util}') print(f' cpuUtilizationPercentageInfo: {util}')
diskVolumeReports = cros.get('diskVolumeReports', []) diskVolumeReports = cros.get('diskVolumeReports', [])
lenDVR = len(diskVolumeReports) lenDVR = len(diskVolumeReports)
@@ -303,16 +315,16 @@ def doGetCrosInfo():
print(f' storageFree: {vstorage_free}') print(f' storageFree: {vstorage_free}')
print(f' storageTotal: {vstorage_total}') print(f' storageTotal: {vstorage_total}')
systemRamFreeReports = _filterCreateReportTime( systemRamFreeReports = _filterCreateReportTime(
cros.get('systemRamFreeReports', []), cros.get('systemRamFreeReports', []), 'reportTime', startDate,
'reportTime', startDate, endDate) endDate)
lenSRFR = len(systemRamFreeReports) lenSRFR = len(systemRamFreeReports)
if lenSRFR: if lenSRFR:
print(' systemRamFreeReports') print(' systemRamFreeReports')
num_ranges = min(lenSRFR, listLimit or lenSRFR) num_ranges = min(lenSRFR, listLimit or lenSRFR)
for systemRamFreeReport in systemRamFreeReports[:num_ranges]: for systemRamFreeReport in systemRamFreeReports[:num_ranges]:
report_time = systemRamFreeReport["reportTime"] report_time = systemRamFreeReport['reportTime']
free_info = systemRamFreeReport["systemRamFreeInfo"] free_info = systemRamFreeReport['systemRamFreeInfo']
free_ram = ",".join(free_info) free_ram = ','.join(free_info)
print(f' reportTime: {report_time}') print(f' reportTime: {report_time}')
print(f' systemRamFreeInfo: {free_ram}') print(f' systemRamFreeInfo: {free_ram}')
@@ -320,11 +332,15 @@ def doGetCrosInfo():
def doPrintCrosActivity(): def doPrintCrosActivity():
cd = gapi_directory.buildGAPIObject() cd = gapi_directory.buildGAPIObject()
todrive = False todrive = False
titles = ['deviceId', 'annotatedAssetId', titles = [
'annotatedLocation', 'serialNumber', 'orgUnitPath'] 'deviceId', 'annotatedAssetId', 'annotatedLocation', 'serialNumber',
'orgUnitPath'
]
csvRows = [] csvRows = []
fieldsList = ['deviceId', 'annotatedAssetId', fieldsList = [
'annotatedLocation', 'serialNumber', 'orgUnitPath'] 'deviceId', 'annotatedAssetId', 'annotatedLocation', 'serialNumber',
'orgUnitPath'
]
startDate = endDate = None startDate = endDate = None
selectActiveTimeRanges = selectDeviceFiles = selectRecentUsers = False selectActiveTimeRanges = selectDeviceFiles = selectRecentUsers = False
listLimit = 0 listLimit = 0
@@ -335,10 +351,10 @@ def doPrintCrosActivity():
while i < len(sys.argv): while i < len(sys.argv):
myarg = sys.argv[i].lower().replace('_', '') myarg = sys.argv[i].lower().replace('_', '')
if myarg in ['query', 'queries']: if myarg in ['query', 'queries']:
queries = gam.getQueries(myarg, sys.argv[i+1]) queries = gam.getQueries(myarg, sys.argv[i + 1])
i += 2 i += 2
elif myarg == 'limittoou': elif myarg == 'limittoou':
orgUnitPath = gam.getOrgUnitItem(sys.argv[i+1]) orgUnitPath = gam.getOrgUnitItem(sys.argv[i + 1])
i += 2 i += 2
elif myarg == 'todrive': elif myarg == 'todrive':
todrive = True todrive = True
@@ -360,32 +376,35 @@ def doPrintCrosActivity():
selectRecentUsers = True selectRecentUsers = True
i += 1 i += 1
elif myarg in CROS_START_ARGUMENTS: elif myarg in CROS_START_ARGUMENTS:
startDate = _getFilterDate(sys.argv[i+1]) startDate = _getFilterDate(sys.argv[i + 1])
i += 2 i += 2
elif myarg in CROS_END_ARGUMENTS: elif myarg in CROS_END_ARGUMENTS:
endDate = _getFilterDate(sys.argv[i+1]) endDate = _getFilterDate(sys.argv[i + 1])
i += 2 i += 2
elif myarg == 'listlimit': elif myarg == 'listlimit':
listLimit = gam.getInteger(sys.argv[i+1], myarg, minVal=0) listLimit = gam.getInteger(sys.argv[i + 1], myarg, minVal=0)
i += 2 i += 2
elif myarg == 'delimiter': elif myarg == 'delimiter':
delimiter = sys.argv[i+1] delimiter = sys.argv[i + 1]
i += 2 i += 2
else: else:
controlflow.invalid_argument_exit( controlflow.invalid_argument_exit(sys.argv[i],
sys.argv[i], "gam print crosactivity") 'gam print crosactivity')
if not selectActiveTimeRanges and \ if not selectActiveTimeRanges and \
not selectDeviceFiles and \ not selectDeviceFiles and \
not selectRecentUsers: not selectRecentUsers:
selectActiveTimeRanges = selectRecentUsers = True selectActiveTimeRanges = selectRecentUsers = True
if selectRecentUsers: if selectRecentUsers:
fieldsList.append('recentUsers') fieldsList.append('recentUsers')
display.add_titles_to_csv_file(['recentUsers.email', ], titles) display.add_titles_to_csv_file([
'recentUsers.email',
], titles)
if selectActiveTimeRanges: if selectActiveTimeRanges:
fieldsList.append('activeTimeRanges') fieldsList.append('activeTimeRanges')
titles_to_add = ['activeTimeRanges.date', titles_to_add = [
'activeTimeRanges.duration', 'activeTimeRanges.date', 'activeTimeRanges.duration',
'activeTimeRanges.minutes'] 'activeTimeRanges.minutes'
]
display.add_titles_to_csv_file(titles_to_add, titles) display.add_titles_to_csv_file(titles_to_add, titles)
if selectDeviceFiles: if selectDeviceFiles:
fieldsList.append('deviceFiles') fieldsList.append('deviceFiles')
@@ -395,13 +414,15 @@ def doPrintCrosActivity():
for query in queries: for query in queries:
gam.printGettingAllItems('CrOS Devices', query) gam.printGettingAllItems('CrOS Devices', query)
page_message = gapi.got_total_items_msg('CrOS Devices', '...\n') page_message = gapi.got_total_items_msg('CrOS Devices', '...\n')
all_cros = gapi.get_all_pages(cd.chromeosdevices(), 'list', all_cros = gapi.get_all_pages(cd.chromeosdevices(),
'list',
'chromeosdevices', 'chromeosdevices',
page_message=page_message, page_message=page_message,
query=query, query=query,
customerId=GC_Values[GC_CUSTOMER_ID], customerId=GC_Values[GC_CUSTOMER_ID],
projection='FULL', projection='FULL',
fields=fields, orgUnitPath=orgUnitPath) fields=fields,
orgUnitPath=orgUnitPath)
for cros in all_cros: for cros in all_cros:
row = {} row = {}
skip_attribs = ['recentUsers', 'activeTimeRanges', 'deviceFiles'] skip_attribs = ['recentUsers', 'activeTimeRanges', 'deviceFiles']
@@ -428,9 +449,9 @@ def doPrintCrosActivity():
num_ranges = min(lenRU, listLimit or lenRU) num_ranges = min(lenRU, listLimit or lenRU)
recent_users = [] recent_users = []
for recentUser in recentUsers[:num_ranges]: for recentUser in recentUsers[:num_ranges]:
useremail = recentUser.get("email") useremail = recentUser.get('email')
if not useremail: if not useremail:
if recentUser["type"] == "USER_TYPE_UNMANAGED": if recentUser['type'] == 'USER_TYPE_UNMANAGED':
useremail = 'UnmanagedUser' useremail = 'UnmanagedUser'
else: else:
useremail = 'Unknown' useremail = 'Unknown'
@@ -439,8 +460,8 @@ def doPrintCrosActivity():
csvRows.append(row) csvRows.append(row)
if selectDeviceFiles: if selectDeviceFiles:
deviceFiles = _filterCreateReportTime( deviceFiles = _filterCreateReportTime(
cros.get('deviceFiles', []), cros.get('deviceFiles', []), 'createTime', startDate,
'createTime', startDate, endDate) endDate)
lenDF = len(deviceFiles) lenDF = len(deviceFiles)
num_ranges = min(lenDF, listLimit or lenDF) num_ranges = min(lenDF, listLimit or lenDF)
for deviceFile in deviceFiles[:num_ranges]: for deviceFile in deviceFiles[:num_ranges]:
@@ -465,6 +486,7 @@ def _checkTPMVulnerability(cros):
def doPrintCrosDevices(): def doPrintCrosDevices():
def _getSelectedLists(myarg): def _getSelectedLists(myarg):
if myarg in CROS_ACTIVE_TIME_RANGES_ARGUMENTS: if myarg in CROS_ACTIVE_TIME_RANGES_ARGUMENTS:
selectedLists['activeTimeRanges'] = True selectedLists['activeTimeRanges'] = True
@@ -485,8 +507,8 @@ def doPrintCrosDevices():
fieldsTitles = {} fieldsTitles = {}
titles = [] titles = []
csvRows = [] csvRows = []
display.add_field_to_csv_file( display.add_field_to_csv_file('deviceid', CROS_ARGUMENT_TO_PROPERTY_MAP,
'deviceid', CROS_ARGUMENT_TO_PROPERTY_MAP, fieldsList, fieldsTitles, titles) fieldsList, fieldsTitles, titles)
projection = orderBy = sortOrder = orgUnitPath = None projection = orderBy = sortOrder = orgUnitPath = None
queries = [None] queries = [None]
noLists = sortHeaders = False noLists = sortHeaders = False
@@ -497,10 +519,10 @@ def doPrintCrosDevices():
while i < len(sys.argv): while i < len(sys.argv):
myarg = sys.argv[i].lower().replace('_', '') myarg = sys.argv[i].lower().replace('_', '')
if myarg in ['query', 'queries']: if myarg in ['query', 'queries']:
queries = gam.getQueries(myarg, sys.argv[i+1]) queries = gam.getQueries(myarg, sys.argv[i + 1])
i += 2 i += 2
elif myarg == 'limittoou': elif myarg == 'limittoou':
orgUnitPath = gam.getOrgUnitItem(sys.argv[i+1]) orgUnitPath = gam.getOrgUnitItem(sys.argv[i + 1])
i += 2 i += 2
elif myarg == 'todrive': elif myarg == 'todrive':
todrive = True todrive = True
@@ -510,21 +532,24 @@ def doPrintCrosDevices():
selectedLists = {} selectedLists = {}
i += 1 i += 1
elif myarg == 'listlimit': elif myarg == 'listlimit':
listLimit = gam.getInteger(sys.argv[i+1], myarg, minVal=0) listLimit = gam.getInteger(sys.argv[i + 1], myarg, minVal=0)
i += 2 i += 2
elif myarg in CROS_START_ARGUMENTS: elif myarg in CROS_START_ARGUMENTS:
startDate = _getFilterDate(sys.argv[i+1]) startDate = _getFilterDate(sys.argv[i + 1])
i += 2 i += 2
elif myarg in CROS_END_ARGUMENTS: elif myarg in CROS_END_ARGUMENTS:
endDate = _getFilterDate(sys.argv[i+1]) endDate = _getFilterDate(sys.argv[i + 1])
i += 2 i += 2
elif myarg == 'orderby': elif myarg == 'orderby':
orderBy = sys.argv[i+1].lower().replace('_', '') orderBy = sys.argv[i + 1].lower().replace('_', '')
validOrderBy = ['location', 'user', 'lastsync', validOrderBy = [
'notes', 'serialnumber', 'status', 'supportenddate'] 'location', 'user', 'lastsync', 'notes', 'serialnumber',
'status', 'supportenddate'
]
if orderBy not in validOrderBy: if orderBy not in validOrderBy:
controlflow.expected_argument_exit( controlflow.expected_argument_exit('orderby',
"orderby", ", ".join(validOrderBy), orderBy) ', '.join(validOrderBy),
orderBy)
if orderBy == 'location': if orderBy == 'location':
orderBy = 'annotatedLocation' orderBy = 'annotatedLocation'
elif orderBy == 'user': elif orderBy == 'user':
@@ -559,11 +584,12 @@ def doPrintCrosDevices():
_getSelectedLists(myarg) _getSelectedLists(myarg)
i += 1 i += 1
elif myarg in CROS_ARGUMENT_TO_PROPERTY_MAP: elif myarg in CROS_ARGUMENT_TO_PROPERTY_MAP:
display.add_field_to_fields_list( display.add_field_to_fields_list(myarg,
myarg, CROS_ARGUMENT_TO_PROPERTY_MAP, fieldsList) CROS_ARGUMENT_TO_PROPERTY_MAP,
fieldsList)
i += 1 i += 1
elif myarg == 'fields': elif myarg == 'fields':
fieldNameList = sys.argv[i+1] fieldNameList = sys.argv[i + 1]
for field in fieldNameList.lower().replace(',', ' ').split(): for field in fieldNameList.lower().replace(',', ' ').split():
if field in CROS_LISTS_ARGUMENTS: if field in CROS_LISTS_ARGUMENTS:
_getSelectedLists(field) _getSelectedLists(field)
@@ -571,17 +597,18 @@ def doPrintCrosDevices():
display.add_field_to_fields_list( display.add_field_to_fields_list(
field, CROS_ARGUMENT_TO_PROPERTY_MAP, fieldsList) field, CROS_ARGUMENT_TO_PROPERTY_MAP, fieldsList)
else: else:
controlflow.invalid_argument_exit( controlflow.invalid_argument_exit(field,
field, "gam print cros fields") 'gam print cros fields')
i += 2 i += 2
else: else:
controlflow.invalid_argument_exit(sys.argv[i], "gam print cros") controlflow.invalid_argument_exit(sys.argv[i], 'gam print cros')
if selectedLists: if selectedLists:
noLists = False noLists = False
projection = 'FULL' projection = 'FULL'
for selectList in selectedLists: for selectList in selectedLists:
display.add_field_to_fields_list( display.add_field_to_fields_list(selectList,
selectList, CROS_ARGUMENT_TO_PROPERTY_MAP, fieldsList) CROS_ARGUMENT_TO_PROPERTY_MAP,
fieldsList)
if fieldsList: if fieldsList:
fieldsList.append('deviceId') fieldsList.append('deviceId')
fields = f'nextPageToken,chromeosdevices({",".join(set(fieldsList))})'.replace( fields = f'nextPageToken,chromeosdevices({",".join(set(fieldsList))})'.replace(
@@ -591,13 +618,16 @@ def doPrintCrosDevices():
for query in queries: for query in queries:
gam.printGettingAllItems('CrOS Devices', query) gam.printGettingAllItems('CrOS Devices', query)
page_message = gapi.got_total_items_msg('CrOS Devices', '...\n') page_message = gapi.got_total_items_msg('CrOS Devices', '...\n')
all_cros = gapi.get_all_pages(cd.chromeosdevices(), 'list', all_cros = gapi.get_all_pages(cd.chromeosdevices(),
'list',
'chromeosdevices', 'chromeosdevices',
page_message=page_message, query=query, page_message=page_message,
query=query,
customerId=GC_Values[GC_CUSTOMER_ID], customerId=GC_Values[GC_CUSTOMER_ID],
projection=projection, projection=projection,
orgUnitPath=orgUnitPath, orgUnitPath=orgUnitPath,
orderBy=orderBy, sortOrder=sortOrder, orderBy=orderBy,
sortOrder=sortOrder,
fields=fields) fields=fields)
for cros in all_cros: for cros in all_cros:
_checkTPMVulnerability(cros) _checkTPMVulnerability(cros)
@@ -612,8 +642,9 @@ def doPrintCrosDevices():
tempInfos = cpuStatusReport.get('cpuTemperatureInfo', []) tempInfos = cpuStatusReport.get('cpuTemperatureInfo', [])
for tempInfo in tempInfos: for tempInfo in tempInfos:
tempInfo['label'] = tempInfo['label'].strip() tempInfo['label'] = tempInfo['label'].strip()
display.add_row_titles_to_csv_file(utils.flatten_json( display.add_row_titles_to_csv_file(
cros, listLimit=listLimit), csvRows, titles) utils.flatten_json(cros, listLimit=listLimit), csvRows,
titles)
continue continue
for cros in all_cros: for cros in all_cros:
if 'notes' in cros: if 'notes' in cros:
@@ -623,11 +654,11 @@ def doPrintCrosDevices():
cros['autoUpdateExpiration']) cros['autoUpdateExpiration'])
row = {} row = {}
for attrib in cros: for attrib in cros:
if attrib not in set(['kind', 'etag', 'tpmVersionInfo', if attrib not in set([
'recentUsers', 'activeTimeRanges', 'kind', 'etag', 'tpmVersionInfo', 'recentUsers',
'deviceFiles', 'cpuStatusReports', 'activeTimeRanges', 'deviceFiles', 'cpuStatusReports',
'diskVolumeReports', 'diskVolumeReports', 'systemRamFreeReports'
'systemRamFreeReports']): ]):
row[attrib] = cros[attrib] row[attrib] = cros[attrib]
if selectedLists.get('activeTimeRanges'): if selectedLists.get('activeTimeRanges'):
timergs = cros.get('activeTimeRanges', []) timergs = cros.get('activeTimeRanges', [])
@@ -649,8 +680,8 @@ def doPrintCrosDevices():
else: else:
cpu_reports = [] cpu_reports = []
cpuStatusReports = _filterCreateReportTime(cpu_reports, cpuStatusReports = _filterCreateReportTime(cpu_reports,
'reportTime', 'reportTime', startDate,
startDate, endDate) endDate)
if selectedLists.get('diskVolumeReports'): if selectedLists.get('diskVolumeReports'):
diskVolumeReports = cros.get('diskVolumeReports', []) diskVolumeReports = cros.get('diskVolumeReports', [])
else: else:
@@ -659,10 +690,8 @@ def doPrintCrosDevices():
ram_reports = cros.get('systemRamFreeReports', []) ram_reports = cros.get('systemRamFreeReports', [])
else: else:
ram_reports = [] ram_reports = []
systemRamFreeReports = _filterCreateReportTime(ram_reports, systemRamFreeReports = _filterCreateReportTime(
'reportTime', ram_reports, 'reportTime', startDate, endDate)
startDate,
endDate)
if noLists or (not activeTimeRanges and \ if noLists or (not activeTimeRanges and \
not recentUsers and \ not recentUsers and \
not deviceFiles and \ not deviceFiles and \
@@ -707,7 +736,7 @@ def doPrintCrosDevices():
tempInfos = cpuStatusReports[i].get('cpuTemperatureInfo', tempInfos = cpuStatusReports[i].get('cpuTemperatureInfo',
[]) [])
for tempInfo in tempInfos: for tempInfo in tempInfos:
label = tempInfo["label"].strip() label = tempInfo['label'].strip()
base = 'cpuStatusReports.cpuTemperatureInfo.' base = 'cpuStatusReports.cpuTemperatureInfo.'
nrow[f'{base}{label}'] = tempInfo['temperature'] nrow[f'{base}{label}'] = tempInfo['temperature']
cpu_field = 'cpuUtilizationPercentageInfo' cpu_field = 'cpuUtilizationPercentageInfo'
@@ -735,16 +764,18 @@ def doPrintCrosDevices():
','.join(ram_info) ','.join(ram_info)
display.add_row_titles_to_csv_file(nrow, csvRows, titles) display.add_row_titles_to_csv_file(nrow, csvRows, titles)
if sortHeaders: if sortHeaders:
display.sort_csv_titles(['deviceId', ], titles) display.sort_csv_titles([
'deviceId',
], titles)
display.write_csv_file(csvRows, titles, 'CrOS', todrive) display.write_csv_file(csvRows, titles, 'CrOS', todrive)
def getCrOSDeviceEntity(i, cd): def getCrOSDeviceEntity(i, cd):
myarg = sys.argv[i].lower() myarg = sys.argv[i].lower()
if myarg == 'cros_sn': if myarg == 'cros_sn':
return i+2, gam.getUsersToModify('cros_sn', sys.argv[i+1]) return i + 2, gam.getUsersToModify('cros_sn', sys.argv[i + 1])
if myarg == 'query': if myarg == 'query':
return i+2, gam.getUsersToModify('crosquery', sys.argv[i+1]) return i + 2, gam.getUsersToModify('crosquery', sys.argv[i + 1])
if myarg[:6] == 'query:': if myarg[:6] == 'query:':
query = sys.argv[i][6:] query = sys.argv[i][6:]
if query[:12].lower() == 'orgunitpath:': if query[:12].lower() == 'orgunitpath:':
@@ -752,12 +783,14 @@ def getCrOSDeviceEntity(i, cd):
else: else:
kwargs = {'query': query} kwargs = {'query': query}
fields = 'nextPageToken,chromeosdevices(deviceId)' fields = 'nextPageToken,chromeosdevices(deviceId)'
devices = gapi.get_all_pages(cd.chromeosdevices(), 'list', devices = gapi.get_all_pages(cd.chromeosdevices(),
'list',
'chromeosdevices', 'chromeosdevices',
customerId=GC_Values[GC_CUSTOMER_ID], customerId=GC_Values[GC_CUSTOMER_ID],
fields=fields, **kwargs) fields=fields,
return i+1, [device['deviceId'] for device in devices] **kwargs)
return i+1, sys.argv[i].replace(',', ' ').split() return i + 1, [device['deviceId'] for device in devices]
return i + 1, sys.argv[i].replace(',', ' ').split()
def _getFilterDate(dateStr): def _getFilterDate(dateStr):
@@ -769,8 +802,8 @@ def _filterTimeRanges(activeTimeRanges, startDate, endDate):
return activeTimeRanges return activeTimeRanges
filteredTimeRanges = [] filteredTimeRanges = []
for timeRange in activeTimeRanges: for timeRange in activeTimeRanges:
activityDate = datetime.datetime.strptime( activityDate = datetime.datetime.strptime(timeRange['date'],
timeRange['date'], YYYYMMDD_FORMAT) YYYYMMDD_FORMAT)
if ((startDate is None) or \ if ((startDate is None) or \
(activityDate >= startDate)) and \ (activityDate >= startDate)) and \
((endDate is None) or \ ((endDate is None) or \
View File
@@ -9,11 +9,14 @@ from gam.gapi import reports as gapi_reports
def doGetCustomerInfo(): def doGetCustomerInfo():
cd = gapi_directory.buildGAPIObject() cd = gapi_directory.buildGAPIObject()
customer_info = gapi.call(cd.customers(), 'get', customer_info = gapi.call(cd.customers(),
'get',
customerKey=GC_Values[GC_CUSTOMER_ID]) customerKey=GC_Values[GC_CUSTOMER_ID])
print(f'Customer ID: {customer_info["id"]}') print(f'Customer ID: {customer_info["id"]}')
print(f'Primary Domain: {customer_info["customerDomain"]}') print(f'Primary Domain: {customer_info["customerDomain"]}')
result = gapi.call(cd.domains(), 'get', customer=customer_info['id'], result = gapi.call(cd.domains(),
'get',
customer=customer_info['id'],
domainName=customer_info['customerDomain'], domainName=customer_info['customerDomain'],
fields='verified') fields='verified')
print(f'Primary Domain Verified: {result["verified"]}') print(f'Primary Domain Verified: {result["verified"]}')
@@ -23,11 +26,13 @@ def doGetCustomerInfo():
customer_creation = customer_info['customerCreationTime'] customer_creation = customer_info['customerCreationTime']
date_format = '%Y-%m-%dT%H:%M:%S.%fZ' date_format = '%Y-%m-%dT%H:%M:%S.%fZ'
oldest = datetime.datetime.strptime(customer_creation, date_format) oldest = datetime.datetime.strptime(customer_creation, date_format)
domains = gapi.get_items(cd.domains(), 'list', 'domains', domains = gapi.get_items(cd.domains(),
'list',
'domains',
customer=GC_Values[GC_CUSTOMER_ID], customer=GC_Values[GC_CUSTOMER_ID],
fields='domains(creationTime)') fields='domains(creationTime)')
for domain in domains: for domain in domains:
creation_timestamp = int(domain['creationTime'])/1000 creation_timestamp = int(domain['creationTime']) / 1000
domain_creation = datetime.datetime.fromtimestamp(creation_timestamp) domain_creation = datetime.datetime.fromtimestamp(creation_timestamp)
if domain_creation < oldest: if domain_creation < oldest:
oldest = domain_creation oldest = domain_creation
@@ -64,10 +69,12 @@ def doGetCustomerInfo():
throw_reasons = [gapi.errors.ErrorReason.INVALID] throw_reasons = [gapi.errors.ErrorReason.INVALID]
while True: while True:
try: try:
usage = gapi.get_all_pages(rep.customerUsageReports(), 'get', usage = gapi.get_all_pages(rep.customerUsageReports(),
'get',
'usageReports', 'usageReports',
throw_reasons=throw_reasons, throw_reasons=throw_reasons,
customerId=customerId, date=tryDate, customerId=customerId,
date=tryDate,
parameters=parameters) parameters=parameters)
break break
except gapi.errors.GapiInvalidError as e: except gapi.errors.GapiInvalidError as e:
@@ -92,22 +99,25 @@ def doUpdateCustomer():
if myarg in ADDRESS_FIELDS_ARGUMENT_MAP: if myarg in ADDRESS_FIELDS_ARGUMENT_MAP:
body.setdefault('postalAddress', {}) body.setdefault('postalAddress', {})
arg = ADDRESS_FIELDS_ARGUMENT_MAP[myarg] arg = ADDRESS_FIELDS_ARGUMENT_MAP[myarg]
body['postalAddress'][arg] = sys.argv[i+1] body['postalAddress'][arg] = sys.argv[i + 1]
i += 2 i += 2
elif myarg in ['adminsecondaryemail', 'alternateemail']: elif myarg in ['adminsecondaryemail', 'alternateemail']:
body['alternateEmail'] = sys.argv[i+1] body['alternateEmail'] = sys.argv[i + 1]
i += 2 i += 2
elif myarg in ['phone', 'phonenumber']: elif myarg in ['phone', 'phonenumber']:
body['phoneNumber'] = sys.argv[i+1] body['phoneNumber'] = sys.argv[i + 1]
i += 2 i += 2
elif myarg == 'language': elif myarg == 'language':
body['language'] = sys.argv[i+1] body['language'] = sys.argv[i + 1]
i += 2 i += 2
else: else:
controlflow.invalid_argument_exit(myarg, "gam update customer") controlflow.invalid_argument_exit(myarg, 'gam update customer')
if not body: if not body:
controlflow.system_error_exit(2, 'no arguments specified for "gam ' controlflow.system_error_exit(
'update customer"') 2, 'no arguments specified for "gam '
gapi.call(cd.customers(), 'patch', customerKey=GC_Values[GC_CUSTOMER_ID], 'update customer"')
gapi.call(cd.customers(),
'patch',
customerKey=GC_Values[GC_CUSTOMER_ID],
body=body) body=body)
print('Updated customer') print('Updated customer')
View File
@@ -36,15 +36,16 @@ def printBuildings():
fieldsList.append(possible_fields[myarg]) fieldsList.append(possible_fields[myarg])
i += 1 i += 1
# Allows shorter arguments like "name" instead of "buildingname" # Allows shorter arguments like "name" instead of "buildingname"
elif 'building'+myarg in possible_fields: elif 'building' + myarg in possible_fields:
fieldsList.append(possible_fields['building'+myarg]) fieldsList.append(possible_fields['building' + myarg])
i += 1 i += 1
else: else:
controlflow.invalid_argument_exit( controlflow.invalid_argument_exit(sys.argv[i],
sys.argv[i], "gam print buildings") 'gam print buildings')
if fields: if fields:
fields = fields % ','.join(fieldsList) fields = fields % ','.join(fieldsList)
buildings = gapi.get_all_pages(cd.resources().buildings(), 'list', buildings = gapi.get_all_pages(cd.resources().buildings(),
'list',
'buildings', 'buildings',
customer=GC_Values[GC_CUSTOMER_ID], customer=GC_Values[GC_CUSTOMER_ID],
fields=fields) fields=fields)
@@ -80,7 +81,7 @@ def printResourceCalendars():
todrive = True todrive = True
i += 1 i += 1
elif myarg == 'query': elif myarg == 'query':
query = sys.argv[i+1] query = sys.argv[i + 1]
i += 2 i += 2
elif myarg == 'allfields': elif myarg == 'allfields':
fieldsList = [] fieldsList = []
@@ -89,8 +90,7 @@ def printResourceCalendars():
for field in RESCAL_ALLFIELDS: for field in RESCAL_ALLFIELDS:
display.add_field_to_csv_file(field, display.add_field_to_csv_file(field,
RESCAL_ARGUMENT_TO_PROPERTY_MAP, RESCAL_ARGUMENT_TO_PROPERTY_MAP,
fieldsList, fieldsTitles, fieldsList, fieldsTitles, titles)
titles)
i += 1 i += 1
elif myarg in RESCAL_ARGUMENT_TO_PROPERTY_MAP: elif myarg in RESCAL_ARGUMENT_TO_PROPERTY_MAP:
display.add_field_to_csv_file(myarg, display.add_field_to_csv_file(myarg,
@@ -98,8 +98,8 @@ def printResourceCalendars():
fieldsList, fieldsTitles, titles) fieldsList, fieldsTitles, titles)
i += 1 i += 1
else: else:
controlflow.invalid_argument_exit( controlflow.invalid_argument_exit(sys.argv[i],
sys.argv[i], "gam print resources") 'gam print resources')
if not fieldsList: if not fieldsList:
for field in RESCAL_DFLTFIELDS: for field in RESCAL_DFLTFIELDS:
display.add_field_to_csv_file(field, display.add_field_to_csv_file(field,
@@ -107,15 +107,19 @@ def printResourceCalendars():
fieldsList, fieldsTitles, titles) fieldsList, fieldsTitles, titles)
fields = f'nextPageToken,items({",".join(set(fieldsList))})' fields = f'nextPageToken,items({",".join(set(fieldsList))})'
if 'buildingId' in fieldsList: if 'buildingId' in fieldsList:
display.add_field_to_csv_file('buildingName', {'buildingName': [ display.add_field_to_csv_file('buildingName',
'buildingName', ]}, fieldsList, fieldsTitles, titles) {'buildingName': ['buildingName',]},
fieldsList, fieldsTitles, titles)
gam.printGettingAllItems('Resource Calendars', None) gam.printGettingAllItems('Resource Calendars', None)
page_message = gapi.got_total_items_first_last_msg('Resource Calendars') page_message = gapi.got_total_items_first_last_msg('Resource Calendars')
resources = gapi.get_all_pages(cd.resources().calendars(), 'list', resources = gapi.get_all_pages(cd.resources().calendars(),
'items', page_message=page_message, 'list',
'items',
page_message=page_message,
message_attribute='resourceId', message_attribute='resourceId',
customer=GC_Values[GC_CUSTOMER_ID], customer=GC_Values[GC_CUSTOMER_ID],
query=query, fields=fields) query=query,
fields=fields)
for resource in resources: for resource in resources:
if 'featureInstances' in resource: if 'featureInstances' in resource:
features = [a_feature['feature']['name'] for \ features = [a_feature['feature']['name'] for \
@@ -129,35 +133,50 @@ def printResourceCalendars():
for field in fieldsList: for field in fieldsList:
resUnit[fieldsTitles[field]] = resource.get(field, '') resUnit[fieldsTitles[field]] = resource.get(field, '')
csvRows.append(resUnit) csvRows.append(resUnit)
display.sort_csv_titles( display.sort_csv_titles(['resourceId', 'resourceName', 'resourceEmail'],
['resourceId', 'resourceName', 'resourceEmail'], titles) titles)
display.write_csv_file(csvRows, titles, 'Resources', todrive) display.write_csv_file(csvRows, titles, 'Resources', todrive)
RESCAL_DFLTFIELDS = ['id', 'name', 'email',] RESCAL_DFLTFIELDS = [
RESCAL_ALLFIELDS = ['id', 'name', 'email', 'description', 'type', 'id',
'buildingid', 'category', 'capacity', 'features', 'floor', 'name',
'floorsection', 'generatedresourcename', 'email',
'uservisibledescription',] ]
RESCAL_ALLFIELDS = [
'id',
'name',
'email',
'description',
'type',
'buildingid',
'category',
'capacity',
'features',
'floor',
'floorsection',
'generatedresourcename',
'uservisibledescription',
]
RESCAL_ARGUMENT_TO_PROPERTY_MAP = { RESCAL_ARGUMENT_TO_PROPERTY_MAP = {
'description': ['resourceDescription'], 'description': ['resourceDescription'],
'building': ['buildingId', ], 'building': ['buildingId',],
'buildingid': ['buildingId', ], 'buildingid': ['buildingId',],
'capacity': ['capacity', ], 'capacity': ['capacity',],
'category': ['resourceCategory', ], 'category': ['resourceCategory',],
'email': ['resourceEmail'], 'email': ['resourceEmail'],
'feature': ['featureInstances', ], 'feature': ['featureInstances',],
'features': ['featureInstances', ], 'features': ['featureInstances',],
'floor': ['floorName', ], 'floor': ['floorName',],
'floorname': ['floorName', ], 'floorname': ['floorName',],
'floorsection': ['floorSection', ], 'floorsection': ['floorSection',],
'generatedresourcename': ['generatedResourceName', ], 'generatedresourcename': ['generatedResourceName',],
'id': ['resourceId'], 'id': ['resourceId'],
'name': ['resourceName'], 'name': ['resourceName'],
'type': ['resourceType'], 'type': ['resourceType'],
'userdescription': ['userVisibleDescription', ], 'userdescription': ['userVisibleDescription',],
'uservisibledescription': ['userVisibleDescription', ], 'uservisibledescription': ['userVisibleDescription',],
} }
@@ -183,15 +202,15 @@ def printFeatures():
elif myarg in possible_fields: elif myarg in possible_fields:
fieldsList.append(possible_fields[myarg]) fieldsList.append(possible_fields[myarg])
i += 1 i += 1
elif 'feature'+myarg in possible_fields: elif 'feature' + myarg in possible_fields:
fieldsList.append(possible_fields['feature'+myarg]) fieldsList.append(possible_fields['feature' + myarg])
i += 1 i += 1
else: else:
controlflow.invalid_argument_exit( controlflow.invalid_argument_exit(sys.argv[i], 'gam print features')
sys.argv[i], "gam print features")
if fields: if fields:
fields = fields % ','.join(fieldsList) fields = fields % ','.join(fieldsList)
features = gapi.get_all_pages(cd.resources().features(), 'list', features = gapi.get_all_pages(cd.resources().features(),
'list',
'features', 'features',
customer=GC_Values[GC_CUSTOMER_ID], customer=GC_Values[GC_CUSTOMER_ID],
fields=fields) fields=fields)
@@ -213,57 +232,62 @@ def _getBuildingAttributes(args, body={}):
while i < len(args): while i < len(args):
myarg = args[i].lower().replace('_', '') myarg = args[i].lower().replace('_', '')
if myarg == 'id': if myarg == 'id':
body['buildingId'] = args[i+1] body['buildingId'] = args[i + 1]
i += 2 i += 2
elif myarg == 'name': elif myarg == 'name':
body['buildingName'] = args[i+1] body['buildingName'] = args[i + 1]
i += 2 i += 2
elif myarg in ['lat', 'latitude']: elif myarg in ['lat', 'latitude']:
if 'coordinates' not in body: if 'coordinates' not in body:
body['coordinates'] = {} body['coordinates'] = {}
body['coordinates']['latitude'] = args[i+1] body['coordinates']['latitude'] = args[i + 1]
i += 2 i += 2
elif myarg in ['long', 'lng', 'longitude']: elif myarg in ['long', 'lng', 'longitude']:
if 'coordinates' not in body: if 'coordinates' not in body:
body['coordinates'] = {} body['coordinates'] = {}
body['coordinates']['longitude'] = args[i+1] body['coordinates']['longitude'] = args[i + 1]
i += 2 i += 2
elif myarg == 'description': elif myarg == 'description':
body['description'] = args[i+1] body['description'] = args[i + 1]
            i += 2
        elif myarg == 'floors':
            body['floorNames'] = args[i + 1].split(',')
            i += 2
        else:
            controlflow.invalid_argument_exit(myarg,
                                              'gam create|update building')
    return body


def createBuilding():
    cd = gapi_directory.buildGAPIObject()
    body = {
        'floorNames': ['1'],
        'buildingId': str(uuid.uuid4()),
        'buildingName': sys.argv[3]
    }
    body = _getBuildingAttributes(sys.argv[4:], body)
    print(f'Creating building {body["buildingId"]}...')
    gapi.call(cd.resources().buildings(),
              'insert',
              customer=GC_Values[GC_CUSTOMER_ID],
              body=body)


def _makeBuildingIdNameMap(cd):
    fields = 'nextPageToken,buildings(buildingId,buildingName)'
    buildings = gapi.get_all_pages(cd.resources().buildings(),
                                   'list',
                                   'buildings',
                                   customer=GC_Values[GC_CUSTOMER_ID],
                                   fields=fields)
    GM_Globals[GM_MAP_BUILDING_ID_TO_NAME] = {}
    GM_Globals[GM_MAP_BUILDING_NAME_TO_ID] = {}
    for building in buildings:
        GM_Globals[GM_MAP_BUILDING_ID_TO_NAME][
            building['buildingId']] = building['buildingName']
        GM_Globals[GM_MAP_BUILDING_NAME_TO_ID][
            building['buildingName']] = building['buildingId']


def getBuildingByNameOrId(cd, which_building, minLen=1):
@@ -283,10 +307,13 @@ def getBuildingByNameOrId(cd, which_building, minLen=1):
    # No exact name match, check for case insensitive name matches
    which_building_lower = which_building.lower()
    ci_matches = []
    for buildingName, buildingId in GM_Globals[
            GM_MAP_BUILDING_NAME_TO_ID].items():
        if buildingName.lower() == which_building_lower:
            ci_matches.append({
                'buildingName': buildingName,
                'buildingId': buildingId
            })
    # One match, return ID
    if len(ci_matches) == 1:
        return ci_matches[0]['buildingId']
@@ -323,15 +350,18 @@ def updateBuilding():
    buildingId = getBuildingByNameOrId(cd, sys.argv[3])
    body = _getBuildingAttributes(sys.argv[4:])
    print(f'Updating building {buildingId}...')
    gapi.call(cd.resources().buildings(),
              'patch',
              customer=GC_Values[GC_CUSTOMER_ID],
              buildingId=buildingId,
              body=body)


def getBuildingInfo():
    cd = gapi_directory.buildGAPIObject()
    buildingId = getBuildingByNameOrId(cd, sys.argv[3])
    building = gapi.call(cd.resources().buildings(),
                         'get',
                         customer=GC_Values[GC_CUSTOMER_ID],
                         buildingId=buildingId)
    if 'buildingId' in building:
@@ -347,8 +377,10 @@ def deleteBuilding():
    cd = gapi_directory.buildGAPIObject()
    buildingId = getBuildingByNameOrId(cd, sys.argv[3])
    print(f'Deleting building {buildingId}...')
    gapi.call(cd.resources().buildings(),
              'delete',
              customer=GC_Values[GC_CUSTOMER_ID],
              buildingId=buildingId)


def _getFeatureAttributes(args, body={}):
@@ -356,11 +388,11 @@ def _getFeatureAttributes(args, body={}):
    while i < len(args):
        myarg = args[i].lower().replace('_', '')
        if myarg == 'name':
            body['name'] = args[i + 1]
            i += 2
        else:
            controlflow.invalid_argument_exit(myarg,
                                              'gam create|update feature')
    return body
@@ -368,8 +400,10 @@ def createFeature():
    cd = gapi_directory.buildGAPIObject()
    body = _getFeatureAttributes(sys.argv[3:])
    print(f'Creating feature {body["name"]}...')
    gapi.call(cd.resources().features(),
              'insert',
              customer=GC_Values[GC_CUSTOMER_ID],
              body=body)


def updateFeature():
@@ -380,8 +414,10 @@ def updateFeature():
    oldName = sys.argv[3]
    body = {'newName': sys.argv[5:]}
    print(f'Updating feature {oldName}...')
    gapi.call(cd.resources().features(),
              'rename',
              customer=GC_Values[GC_CUSTOMER_ID],
              oldName=oldName,
              body=body)
@@ -389,8 +425,10 @@ def deleteFeature():
    cd = gapi_directory.buildGAPIObject()
    featureKey = sys.argv[3]
    print(f'Deleting feature {featureKey}...')
    gapi.call(cd.resources().features(),
              'delete',
              customer=GC_Values[GC_CUSTOMER_ID],
              featureKey=featureKey)


def _getResourceCalendarAttributes(cd, args, body={}):
@@ -398,56 +436,58 @@ def _getResourceCalendarAttributes(cd, args, body={}):
    while i < len(args):
        myarg = args[i].lower().replace('_', '')
        if myarg == 'name':
            body['resourceName'] = args[i + 1]
            i += 2
        elif myarg == 'description':
            body['resourceDescription'] = args[i + 1].replace('\\n', '\n')
            i += 2
        elif myarg == 'type':
            body['resourceType'] = args[i + 1]
            i += 2
        elif myarg in ['building', 'buildingid']:
            body['buildingId'] = getBuildingByNameOrId(cd,
                                                       args[i + 1],
                                                       minLen=0)
            i += 2
        elif myarg in ['capacity']:
            body['capacity'] = gam.getInteger(args[i + 1], myarg, minVal=0)
            i += 2
        elif myarg in ['feature', 'features']:
            features = args[i + 1].split(',')
            body['featureInstances'] = []
            for feature in features:
                instance = {'feature': {'name': feature}}
                body['featureInstances'].append(instance)
            i += 2
        elif myarg in ['floor', 'floorname']:
            body['floorName'] = args[i + 1]
            i += 2
        elif myarg in ['floorsection']:
            body['floorSection'] = args[i + 1]
            i += 2
        elif myarg in ['category']:
            body['resourceCategory'] = args[i + 1].upper()
            if body['resourceCategory'] == 'ROOM':
                body['resourceCategory'] = 'CONFERENCE_ROOM'
            i += 2
        elif myarg in ['uservisibledescription', 'userdescription']:
            body['userVisibleDescription'] = args[i + 1]
            i += 2
        else:
            controlflow.invalid_argument_exit(args[i],
                                              'gam create|update resource')
    return body


def createResourceCalendar():
    cd = gapi_directory.buildGAPIObject()
    body = {'resourceId': sys.argv[3], 'resourceName': sys.argv[4]}
    body = _getResourceCalendarAttributes(cd, sys.argv[5:], body)
    print(f'Creating resource {body["resourceId"]}...')
    gapi.call(cd.resources().calendars(),
              'insert',
              customer=GC_Values[GC_CUSTOMER_ID],
              body=body)


def updateResourceCalendar():
@@ -456,16 +496,20 @@ def updateResourceCalendar():
    body = _getResourceCalendarAttributes(cd, sys.argv[4:])
    # Use patch since it seems to work better.
    # update requires name to be set.
    gapi.call(cd.resources().calendars(),
              'patch',
              customer=GC_Values[GC_CUSTOMER_ID],
              calendarResourceId=resId,
              body=body,
              fields='')
    print(f'updated resource {resId}')


def getResourceCalendarInfo():
    cd = gapi_directory.buildGAPIObject()
    resId = sys.argv[3]
    resource = gapi.call(cd.resources().calendars(),
                         'get',
                         customer=GC_Values[GC_CUSTOMER_ID],
                         calendarResourceId=resId)
    if 'featureInstances' in resource:
@@ -474,8 +518,8 @@ def getResourceCalendarInfo():
            features.append(a_feature['feature']['name'])
        resource['features'] = ', '.join(features)
    if 'buildingId' in resource:
        resource['buildingName'] = getBuildingNameById(cd,
                                                       resource['buildingId'])
        resource['buildingId'] = f'id:{resource["buildingId"]}'
    display.print_json(resource)
@@ -484,5 +528,7 @@ def deleteResourceCalendar():
    resId = sys.argv[3]
    cd = gapi_directory.buildGAPIObject()
    print(f'Deleting resource calendar {resId}')
    gapi.call(cd.resources().calendars(),
              'delete',
              customer=GC_Values[GC_CUSTOMER_ID],
              calendarResourceId=resId)
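
The `_get*Attributes` helpers above all share the same flat keyword/value walk over the command-line arguments: read a keyword, consume its value, advance the index by two, and exit on anything unrecognized. A minimal standalone sketch of that pattern (`parse_building_args` and the `ValueError` are illustrative stand-ins, not GAM's API, which calls `controlflow.invalid_argument_exit` instead):

```python
def parse_building_args(args, body=None):
    """Walk a flat list of keyword/value pairs, two items at a time."""
    body = body if body is not None else {}
    i = 0
    while i < len(args):
        # Keywords are matched case-insensitively, underscores ignored.
        myarg = args[i].lower().replace('_', '')
        if myarg == 'name':
            body['buildingName'] = args[i + 1]
            i += 2
        elif myarg == 'floors':
            # Comma-separated values become a list.
            body['floorNames'] = args[i + 1].split(',')
            i += 2
        else:
            raise ValueError(f'unknown argument: {myarg}')
    return body
```

The two-step advance is what lets every keyword take exactly one value while the `else:` branch still catches a stray or misspelled keyword by name.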
View File
@@ -9,235 +9,239 @@ from gam.var import UTF8


class GapiAbortedError(Exception):
    pass


class GapiAuthErrorError(Exception):
    pass


class GapiBadGatewayError(Exception):
    pass


class GapiBadRequestError(Exception):
    pass


class GapiConditionNotMetError(Exception):
    pass


class GapiCyclicMembershipsNotAllowedError(Exception):
    pass


class GapiDomainCannotUseApisError(Exception):
    pass


class GapiDomainNotFoundError(Exception):
    pass


class GapiDuplicateError(Exception):
    pass


class GapiFailedPreconditionError(Exception):
    pass


class GapiForbiddenError(Exception):
    pass


class GapiGatewayTimeoutError(Exception):
    pass


class GapiGroupNotFoundError(Exception):
    pass


class GapiInvalidError(Exception):
    pass


class GapiInvalidArgumentError(Exception):
    pass


class GapiInvalidMemberError(Exception):
    pass


class GapiMemberNotFoundError(Exception):
    pass


class GapiNotFoundError(Exception):
    pass


class GapiNotImplementedError(Exception):
    pass


class GapiPermissionDeniedError(Exception):
    pass


class GapiResourceNotFoundError(Exception):
    pass


class GapiServiceNotAvailableError(Exception):
    pass


class GapiUserNotFoundError(Exception):
    pass


# GAPI Error Reasons
class ErrorReason(Enum):
    """The reason why a non-200 HTTP response was returned from a GAPI."""
    ABORTED = 'aborted'
    AUTH_ERROR = 'authError'
    BACKEND_ERROR = 'backendError'
    BAD_GATEWAY = 'badGateway'
    BAD_REQUEST = 'badRequest'
    CONDITION_NOT_MET = 'conditionNotMet'
    CYCLIC_MEMBERSHIPS_NOT_ALLOWED = 'cyclicMembershipsNotAllowed'
    DAILY_LIMIT_EXCEEDED = 'dailyLimitExceeded'
    DOMAIN_CANNOT_USE_APIS = 'domainCannotUseApis'
    DOMAIN_NOT_FOUND = 'domainNotFound'
    DUPLICATE = 'duplicate'
    FAILED_PRECONDITION = 'failedPrecondition'
    FORBIDDEN = 'forbidden'
    FOUR_O_NINE = '409'
    FOUR_O_THREE = '403'
    FOUR_TWO_NINE = '429'
    GATEWAY_TIMEOUT = 'gatewayTimeout'
    GROUP_NOT_FOUND = 'groupNotFound'
    INTERNAL_ERROR = 'internalError'
    INVALID = 'invalid'
    INVALID_ARGUMENT = 'invalidArgument'
    INVALID_MEMBER = 'invalidMember'
    MEMBER_NOT_FOUND = 'memberNotFound'
    NOT_FOUND = 'notFound'
    NOT_IMPLEMENTED = 'notImplemented'
    PERMISSION_DENIED = 'permissionDenied'
    QUOTA_EXCEEDED = 'quotaExceeded'
    RATE_LIMIT_EXCEEDED = 'rateLimitExceeded'
    RESOURCE_NOT_FOUND = 'resourceNotFound'
    SERVICE_NOT_AVAILABLE = 'serviceNotAvailable'
    SERVICE_LIMIT = 'serviceLimit'
    SYSTEM_ERROR = 'systemError'
    USER_NOT_FOUND = 'userNotFound'
    USER_RATE_LIMIT_EXCEEDED = 'userRateLimitExceeded'

    def __str__(self):
        return str(self.value)


# Common sets of GAPI error reasons
DEFAULT_RETRY_REASONS = [
    ErrorReason.QUOTA_EXCEEDED,
    ErrorReason.RATE_LIMIT_EXCEEDED,
    ErrorReason.USER_RATE_LIMIT_EXCEEDED,
    ErrorReason.BACKEND_ERROR,
    ErrorReason.BAD_GATEWAY,
    ErrorReason.GATEWAY_TIMEOUT,
    ErrorReason.INTERNAL_ERROR,
    ErrorReason.FOUR_TWO_NINE,
]
GMAIL_THROW_REASONS = [ErrorReason.SERVICE_NOT_AVAILABLE]
GROUP_GET_THROW_REASONS = [
    ErrorReason.GROUP_NOT_FOUND, ErrorReason.DOMAIN_NOT_FOUND,
    ErrorReason.DOMAIN_CANNOT_USE_APIS, ErrorReason.FORBIDDEN,
    ErrorReason.BAD_REQUEST
]
GROUP_GET_RETRY_REASONS = [ErrorReason.INVALID, ErrorReason.SYSTEM_ERROR]
MEMBERS_THROW_REASONS = [
    ErrorReason.GROUP_NOT_FOUND, ErrorReason.DOMAIN_NOT_FOUND,
    ErrorReason.DOMAIN_CANNOT_USE_APIS, ErrorReason.INVALID,
    ErrorReason.FORBIDDEN
]
MEMBERS_RETRY_REASONS = [ErrorReason.SYSTEM_ERROR]

# A map of GAPI error reasons to the corresponding GAM Python Exception
ERROR_REASON_TO_EXCEPTION = {
    ErrorReason.ABORTED:
        GapiAbortedError,
    ErrorReason.AUTH_ERROR:
        GapiAuthErrorError,
    ErrorReason.BAD_GATEWAY:
        GapiBadGatewayError,
    ErrorReason.BAD_REQUEST:
        GapiBadRequestError,
    ErrorReason.CONDITION_NOT_MET:
        GapiConditionNotMetError,
    ErrorReason.CYCLIC_MEMBERSHIPS_NOT_ALLOWED:
        GapiCyclicMembershipsNotAllowedError,
    ErrorReason.DOMAIN_CANNOT_USE_APIS:
        GapiDomainCannotUseApisError,
    ErrorReason.DOMAIN_NOT_FOUND:
        GapiDomainNotFoundError,
    ErrorReason.DUPLICATE:
        GapiDuplicateError,
    ErrorReason.FAILED_PRECONDITION:
        GapiFailedPreconditionError,
    ErrorReason.FORBIDDEN:
        GapiForbiddenError,
    ErrorReason.GATEWAY_TIMEOUT:
        GapiGatewayTimeoutError,
    ErrorReason.GROUP_NOT_FOUND:
        GapiGroupNotFoundError,
    ErrorReason.INVALID:
        GapiInvalidError,
    ErrorReason.INVALID_ARGUMENT:
        GapiInvalidArgumentError,
    ErrorReason.INVALID_MEMBER:
        GapiInvalidMemberError,
    ErrorReason.MEMBER_NOT_FOUND:
        GapiMemberNotFoundError,
    ErrorReason.NOT_FOUND:
        GapiNotFoundError,
    ErrorReason.NOT_IMPLEMENTED:
        GapiNotImplementedError,
    ErrorReason.PERMISSION_DENIED:
        GapiPermissionDeniedError,
    ErrorReason.RESOURCE_NOT_FOUND:
        GapiResourceNotFoundError,
    ErrorReason.SERVICE_NOT_AVAILABLE:
        GapiServiceNotAvailableError,
    ErrorReason.USER_NOT_FOUND:
        GapiUserNotFoundError,
}

# OAuth Token Errors
OAUTH2_TOKEN_ERRORS = [
    'access_denied',
    'access_denied: Requested client not authorized',
    'internal_failure: Backend Error',
    'internal_failure: None',
    'invalid_grant',
    'invalid_grant: Bad Request',
    'invalid_grant: Invalid email or User ID',
    'invalid_grant: Not a valid email',
    'invalid_grant: Invalid JWT: No valid verifier found for issuer',
    'invalid_grant: The account has been deleted',
    'invalid_request: Invalid impersonation prn email address',
    'invalid_request: Invalid impersonation "sub" field',
    'unauthorized_client: Client is unauthorized to retrieve access tokens '
    'using this method',
    'unauthorized_client: Client is unauthorized to retrieve access tokens '
    'using this method, or client not authorized for any of the scopes '
    'requested',
    'unauthorized_client: Unauthorized client or scope in request',
]


def _create_http_error_dict(status_code, reason, message):
    """Creates a basic error dict similar to most Google API Errors.

    Args:
      status_code: Int, the error's HTTP response status code.
@@ -247,22 +251,22 @@ def _create_http_error_dict(status_code, reason, message):

    Returns:
      dict
    """
    return {
        'error': {
            'code': status_code,
            'errors': [{
                'reason': str(reason),
                'message': message,
            }]
        }
    }


def get_gapi_error_detail(e,
                          soft_errors=False,
                          silent_errors=False,
                          retry_on_http_error=False):
    """Extracts error detail from a non-200 GAPI Response.

    Args:
      e: googleapiclient.HttpError, The HTTP Error received.
@@ -278,88 +282,93 @@ def get_gapi_error_detail(e,
      A tuple containing the HTTP Response code, GAPI error reason, and error
      message.
    """
    try:
        error = json.loads(e.content.decode(UTF8))
    except ValueError:
        error_content = e.content.decode(UTF8) if isinstance(
            e.content, bytes) else e.content
        if (e.resp['status'] == '503') and (
                error_content == 'Quota exceeded for the current request'):
            return (e.resp['status'], ErrorReason.QUOTA_EXCEEDED.value,
                    error_content)
        if (e.resp['status'] == '403') and (error_content.startswith(
                'Request rate higher than configured')):
            return (e.resp['status'], ErrorReason.QUOTA_EXCEEDED.value,
                    error_content)
        if (e.resp['status'] == '502') and ('Bad Gateway' in error_content):
            return (e.resp['status'], ErrorReason.BAD_GATEWAY.value,
                    error_content)
        if (e.resp['status'] == '504') and ('Gateway Timeout' in error_content):
            return (e.resp['status'], ErrorReason.GATEWAY_TIMEOUT.value,
                    error_content)
        if (e.resp['status'] == '403') and ('Invalid domain.' in error_content):
            error = _create_http_error_dict(403, ErrorReason.NOT_FOUND.value,
                                            'Domain not found')
        elif (e.resp['status'] == '400') and (
                'InvalidSsoSigningKey' in error_content):
            error = _create_http_error_dict(400, ErrorReason.INVALID.value,
                                            'InvalidSsoSigningKey')
        elif (e.resp['status'] == '400') and ('UnknownError' in error_content):
            error = _create_http_error_dict(400, ErrorReason.INVALID.value,
                                            'UnknownError')
        elif retry_on_http_error:
            return (-1, None, None)
        elif soft_errors:
            if not silent_errors:
                display.print_error(error_content)
            return (0, None, None)
        else:
            controlflow.system_error_exit(5, error_content)
        # END: ValueError catch

    if 'error' in error:
        http_status = error['error']['code']
        try:
            message = error['error']['errors'][0]['message']
        except KeyError:
            message = error['error']['message']
    else:
        if 'error_description' in error:
            if error['error_description'] == 'Invalid Value':
                message = error['error_description']
                http_status = 400
                error = _create_http_error_dict(400, ErrorReason.INVALID.value,
                                                message)
            else:
                controlflow.system_error_exit(4, str(error))
        else:
            controlflow.system_error_exit(4, str(error))

    # Extract the error reason
    try:
        reason = error['error']['errors'][0]['reason']
        if reason == 'notFound':
            if 'userKey' in message:
                reason = ErrorReason.USER_NOT_FOUND.value
            elif 'groupKey' in message:
                reason = ErrorReason.GROUP_NOT_FOUND.value
            elif 'memberKey' in message:
                reason = ErrorReason.MEMBER_NOT_FOUND.value
            elif 'Domain not found' in message:
                reason = ErrorReason.DOMAIN_NOT_FOUND.value
            elif 'Resource Not Found' in message:
                reason = ErrorReason.RESOURCE_NOT_FOUND.value
        elif reason == 'invalid':
            if 'userId' in message:
                reason = ErrorReason.USER_NOT_FOUND.value
            elif 'memberKey' in message:
                reason = ErrorReason.INVALID_MEMBER.value
        elif reason == 'failedPrecondition':
            if 'Bad Request' in message:
                reason = ErrorReason.BAD_REQUEST.value
            elif 'Mail service not enabled' in message:
                reason = ErrorReason.SERVICE_NOT_AVAILABLE.value
        elif reason == 'required':
            if 'memberKey' in message:
                reason = ErrorReason.MEMBER_NOT_FOUND.value
        elif reason == 'conditionNotMet':
            if 'Cyclic memberships not allowed' in message:
                reason = ErrorReason.CYCLIC_MEMBERSHIPS_NOT_ALLOWED.value
    except KeyError:
        reason = f'{http_status}'
    return (http_status, reason, message)
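
`get_gapi_error_detail` unpacks the same JSON error envelope that `_create_http_error_dict` builds. A standalone illustration of that envelope's shape and how the status, reason, and message fall out of it (the `create_http_error_dict` helper below is a local sketch mirroring the structure above, not an import from the GAM module):

```python
import json


def create_http_error_dict(status_code, reason, message):
    # Same nested shape most Google API errors use.
    return {
        'error': {
            'code': status_code,
            'errors': [{
                'reason': str(reason),
                'message': message,
            }]
        }
    }


payload = create_http_error_dict(404, 'notFound', 'Domain not found')
# An HttpError delivers this as bytes in e.content; round-trip it the same way.
raw = json.dumps(payload).encode('UTF-8')
error = json.loads(raw.decode('UTF-8'))

http_status = error['error']['code']
reason = error['error']['errors'][0]['reason']
message = error['error']['errors'][0]['message']
```

This is the happy path only; the real function also handles non-JSON bodies, `error_description`-style OAuth responses, and the reason-refinement that follows.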
View File
@@ -10,200 +10,201 @@ from gam.gapi import errors


def create_simple_http_error(status, reason, message):
    content = errors._create_http_error_dict(status, reason, message)
    return create_http_error(status, content)


def create_http_error(status, content):
    response = {
        'status': status,
        'content-type': 'application/json',
    }
    content_as_bytes = json.dumps(content).encode('UTF-8')
    return googleapiclient.errors.HttpError(response, content_as_bytes)


class ErrorsTest(unittest.TestCase):

    def test_get_gapi_error_detail_quota_exceeded(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_detail_invalid_domain(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_detail_invalid_signing_key(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_detail_unknown_error(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_retry_http_error(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_prints_soft_errors(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_exits_on_unrecoverable_errors(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_quota_exceeded_for_current_request(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_quota_exceeded_high_request_rate(self):
        # TODO: Add test logic once the opening ValueError exception case has a
        # repro case (i.e. an Exception type/format that will cause it to raise).
        pass

    def test_get_gapi_error_extracts_user_not_found(self):
        err = create_simple_http_error(404, 'notFound',
                                       'Resource Not Found: userKey.')
        http_status, reason, message = errors.get_gapi_error_detail(err)
        self.assertEqual(http_status, 404)
        self.assertEqual(reason, errors.ErrorReason.USER_NOT_FOUND.value)
        self.assertEqual(message, 'Resource Not Found: userKey.')
def test_get_gapi_error_extracts_group_not_found(self): def test_get_gapi_error_extracts_group_not_found(self):
err = create_simple_http_error(404, 'notFound', err = create_simple_http_error(404, 'notFound',
'Resource Not Found: groupKey.') 'Resource Not Found: groupKey.')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 404) self.assertEqual(http_status, 404)
self.assertEqual(reason, errors.ErrorReason.GROUP_NOT_FOUND.value) self.assertEqual(reason, errors.ErrorReason.GROUP_NOT_FOUND.value)
self.assertEqual(message, 'Resource Not Found: groupKey.') self.assertEqual(message, 'Resource Not Found: groupKey.')
def test_get_gapi_error_extracts_member_not_found(self): def test_get_gapi_error_extracts_member_not_found(self):
err = create_simple_http_error(404, 'notFound', err = create_simple_http_error(404, 'notFound',
'Resource Not Found: memberKey.') 'Resource Not Found: memberKey.')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 404) self.assertEqual(http_status, 404)
self.assertEqual(reason, errors.ErrorReason.MEMBER_NOT_FOUND.value) self.assertEqual(reason, errors.ErrorReason.MEMBER_NOT_FOUND.value)
self.assertEqual(message, 'Resource Not Found: memberKey.') self.assertEqual(message, 'Resource Not Found: memberKey.')
def test_get_gapi_error_extracts_domain_not_found(self): def test_get_gapi_error_extracts_domain_not_found(self):
err = create_simple_http_error(404, 'notFound', 'Domain not found.') err = create_simple_http_error(404, 'notFound', 'Domain not found.')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 404) self.assertEqual(http_status, 404)
self.assertEqual(reason, errors.ErrorReason.DOMAIN_NOT_FOUND.value) self.assertEqual(reason, errors.ErrorReason.DOMAIN_NOT_FOUND.value)
self.assertEqual(message, 'Domain not found.') self.assertEqual(message, 'Domain not found.')
def test_get_gapi_error_extracts_generic_resource_not_found(self): def test_get_gapi_error_extracts_generic_resource_not_found(self):
err = create_simple_http_error(404, 'notFound', err = create_simple_http_error(404, 'notFound',
'Resource Not Found: unknownResource.') 'Resource Not Found: unknownResource.')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 404) self.assertEqual(http_status, 404)
self.assertEqual(reason, errors.ErrorReason.RESOURCE_NOT_FOUND.value) self.assertEqual(reason, errors.ErrorReason.RESOURCE_NOT_FOUND.value)
self.assertEqual(message, 'Resource Not Found: unknownResource.') self.assertEqual(message, 'Resource Not Found: unknownResource.')
def test_get_gapi_error_extracts_invalid_userid(self): def test_get_gapi_error_extracts_invalid_userid(self):
err = create_simple_http_error(400, 'invalid', 'Invalid Input: userId') err = create_simple_http_error(400, 'invalid', 'Invalid Input: userId')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 400) self.assertEqual(http_status, 400)
self.assertEqual(reason, errors.ErrorReason.USER_NOT_FOUND.value) self.assertEqual(reason, errors.ErrorReason.USER_NOT_FOUND.value)
self.assertEqual(message, 'Invalid Input: userId') self.assertEqual(message, 'Invalid Input: userId')
def test_get_gapi_error_extracts_invalid_member(self): def test_get_gapi_error_extracts_invalid_member(self):
err = create_simple_http_error(400, 'invalid', 'Invalid Input: memberKey') err = create_simple_http_error(400, 'invalid',
http_status, reason, message = errors.get_gapi_error_detail(err) 'Invalid Input: memberKey')
self.assertEqual(http_status, 400) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(reason, errors.ErrorReason.INVALID_MEMBER.value) self.assertEqual(http_status, 400)
self.assertEqual(message, 'Invalid Input: memberKey') self.assertEqual(reason, errors.ErrorReason.INVALID_MEMBER.value)
self.assertEqual(message, 'Invalid Input: memberKey')
def test_get_gapi_error_extracts_bad_request(self): def test_get_gapi_error_extracts_bad_request(self):
err = create_simple_http_error(400, 'failedPrecondition', 'Bad Request') err = create_simple_http_error(400, 'failedPrecondition', 'Bad Request')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 400) self.assertEqual(http_status, 400)
self.assertEqual(reason, errors.ErrorReason.BAD_REQUEST.value) self.assertEqual(reason, errors.ErrorReason.BAD_REQUEST.value)
self.assertEqual(message, 'Bad Request') self.assertEqual(message, 'Bad Request')
def test_get_gapi_error_extracts_service_not_available(self): def test_get_gapi_error_extracts_service_not_available(self):
err = create_simple_http_error(400, 'failedPrecondition', err = create_simple_http_error(400, 'failedPrecondition',
'Mail service not enabled') 'Mail service not enabled')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 400) self.assertEqual(http_status, 400)
self.assertEqual(reason, errors.ErrorReason.SERVICE_NOT_AVAILABLE.value) self.assertEqual(reason, errors.ErrorReason.SERVICE_NOT_AVAILABLE.value)
self.assertEqual(message, 'Mail service not enabled') self.assertEqual(message, 'Mail service not enabled')
def test_get_gapi_error_extracts_required_member_not_found(self): def test_get_gapi_error_extracts_required_member_not_found(self):
err = create_simple_http_error(400, 'required', err = create_simple_http_error(400, 'required',
'Missing required field: memberKey') 'Missing required field: memberKey')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 400) self.assertEqual(http_status, 400)
self.assertEqual(reason, errors.ErrorReason.MEMBER_NOT_FOUND.value) self.assertEqual(reason, errors.ErrorReason.MEMBER_NOT_FOUND.value)
self.assertEqual(message, 'Missing required field: memberKey') self.assertEqual(message, 'Missing required field: memberKey')
def test_get_gapi_error_extracts_cyclic_memberships_error(self): def test_get_gapi_error_extracts_cyclic_memberships_error(self):
err = create_simple_http_error(400, 'conditionNotMet', err = create_simple_http_error(400, 'conditionNotMet',
'Cyclic memberships not allowed') 'Cyclic memberships not allowed')
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, 400) self.assertEqual(http_status, 400)
self.assertEqual(reason, self.assertEqual(
errors.ErrorReason.CYCLIC_MEMBERSHIPS_NOT_ALLOWED.value) reason, errors.ErrorReason.CYCLIC_MEMBERSHIPS_NOT_ALLOWED.value)
self.assertEqual(message, 'Cyclic memberships not allowed') self.assertEqual(message, 'Cyclic memberships not allowed')
def test_get_gapi_error_extracts_single_error_with_message(self): def test_get_gapi_error_extracts_single_error_with_message(self):
status_code = 999 status_code = 999
response = {'status': status_code} response = {'status': status_code}
# This error does not have an "errors" key describing each error. # This error does not have an "errors" key describing each error.
content = {'error': {'code': status_code, 'message': 'unknown error'}} content = {'error': {'code': status_code, 'message': 'unknown error'}}
content_as_bytes = json.dumps(content).encode('UTF-8') content_as_bytes = json.dumps(content).encode('UTF-8')
err = googleapiclient.errors.HttpError(response, content_as_bytes) err = googleapiclient.errors.HttpError(response, content_as_bytes)
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, status_code) self.assertEqual(http_status, status_code)
self.assertEqual(reason, str(status_code)) self.assertEqual(reason, str(status_code))
self.assertEqual(message, content['error']['message']) self.assertEqual(message, content['error']['message'])
def test_get_gapi_error_exits_code_4_on_malformed_error_with_unknown_description( def test_get_gapi_error_exits_code_4_on_malformed_error_with_unknown_description(
self): self):
status_code = 999 status_code = 999
response = {'status': status_code} response = {'status': status_code}
# This error only has an error_description_field and an unknown description. # This error only has an error_description_field and an unknown description.
content = {'error_description': 'something errored'} content = {'error_description': 'something errored'}
content_as_bytes = json.dumps(content).encode('UTF-8') content_as_bytes = json.dumps(content).encode('UTF-8')
err = googleapiclient.errors.HttpError(response, content_as_bytes) err = googleapiclient.errors.HttpError(response, content_as_bytes)
with self.assertRaises(SystemExit) as context: with self.assertRaises(SystemExit) as context:
errors.get_gapi_error_detail(err) errors.get_gapi_error_detail(err)
self.assertEqual(4, context.exception.code) self.assertEqual(4, context.exception.code)
def test_get_gapi_error_exits_on_invalid_error_description(self): def test_get_gapi_error_exits_on_invalid_error_description(self):
status_code = 400 status_code = 400
response = {'status': status_code} response = {'status': status_code}
content = {'error_description': 'Invalid Value'} content = {'error_description': 'Invalid Value'}
content_as_bytes = json.dumps(content).encode('UTF-8') content_as_bytes = json.dumps(content).encode('UTF-8')
err = googleapiclient.errors.HttpError(response, content_as_bytes) err = googleapiclient.errors.HttpError(response, content_as_bytes)
http_status, reason, message = errors.get_gapi_error_detail(err) http_status, reason, message = errors.get_gapi_error_detail(err)
self.assertEqual(http_status, status_code) self.assertEqual(http_status, status_code)
self.assertEqual(reason, errors.ErrorReason.INVALID.value) self.assertEqual(reason, errors.ErrorReason.INVALID.value)
self.assertEqual(message, 'Invalid Value') self.assertEqual(message, 'Invalid Value')
def test_get_gapi_error_exits_code_4_on_unexpected_error_contents(self): def test_get_gapi_error_exits_code_4_on_unexpected_error_contents(self):
status_code = 900 status_code = 900
response = {'status': status_code} response = {'status': status_code}
content = {'notErrorContentThatIsExpected': 'foo'} content = {'notErrorContentThatIsExpected': 'foo'}
content_as_bytes = json.dumps(content).encode('UTF-8') content_as_bytes = json.dumps(content).encode('UTF-8')
err = googleapiclient.errors.HttpError(response, content_as_bytes) err = googleapiclient.errors.HttpError(response, content_as_bytes)
with self.assertRaises(SystemExit) as context: with self.assertRaises(SystemExit) as context:
errors.get_gapi_error_detail(err) errors.get_gapi_error_detail(err)
self.assertEqual(4, context.exception.code) self.assertEqual(4, context.exception.code)
if __name__ == '__main__': if __name__ == '__main__':
unittest.main() unittest.main()

View File

@ -42,12 +42,13 @@ REPORT_CHOICE_MAP = {
def showUsageParameters():
    rep = buildGAPIObject()
    throw_reasons = [
        gapi.errors.ErrorReason.INVALID, gapi.errors.ErrorReason.BAD_REQUEST
    ]
    todrive = False
    if len(sys.argv) == 3:
        controlflow.missing_argument_exit('user or customer',
                                          'report usageparameters')
    report = sys.argv[3].lower()
    titles = ['parameter']
    if report == 'customer':
@ -57,8 +58,8 @@ def showUsageParameters():
        endpoint = rep.userUsageReport()
        kwargs = {'userKey': gam._getValueFromOAuth('email')}
    else:
        controlflow.expected_argument_exit('usageparameters',
                                           ['user', 'customer'], report)
    customerId = GC_Values[GC_CUSTOMER_ID]
    if customerId == MY_CUSTOMER:
        customerId = None
@ -73,10 +74,12 @@ def showUsageParameters():
            todrive = True
            i += 1
        else:
            controlflow.invalid_argument_exit(sys.argv[i],
                                              'gam report usageparameters')
    while True:
        try:
            response = gapi.call(endpoint,
                                 'get',
                                 throw_reasons=throw_reasons,
                                 date=tryDate,
                                 customerId=customerId,
@ -87,7 +90,9 @@ def showUsageParameters():
                if data.get('key') == 'application':
                    partial_on_thisday.append(data['value'])
            if partial_apps:
                partial_apps = [
                    app for app in partial_apps if app in partial_on_thisday
                ]
            else:
                partial_apps = partial_on_thisday
            for parameter in response['usageReports'][0]['parameters']:
@ -104,19 +109,24 @@ def showUsageParameters():
    csvRows = []
    for parameter in all_parameters:
        csvRows.append({'parameter': parameter})
    display.write_csv_file(csvRows, titles,
                           f'{report.capitalize()} Report Usage Parameters',
                           todrive)


REPORTS_PARAMETERS_SIMPLE_TYPES = [
    'intValue', 'boolValue', 'datetimeValue', 'stringValue'
]


def showUsage():
    rep = buildGAPIObject()
    throw_reasons = [
        gapi.errors.ErrorReason.INVALID, gapi.errors.ErrorReason.BAD_REQUEST
    ]
    todrive = False
    if len(sys.argv) == 3:
        controlflow.missing_argument_exit('user or customer', 'report usage')
    report = sys.argv[3].lower()
    titles = ['date']
    if report == 'customer':
@ -127,8 +137,8 @@ def showUsage():
        kwargs = [{'userKey': 'all'}]
        titles.append('user')
    else:
        controlflow.expected_argument_exit('usage', ['user', 'customer'],
                                           report)
    customerId = GC_Values[GC_CUSTOMER_ID]
    if customerId == MY_CUSTOMER:
        customerId = None
@ -141,43 +151,47 @@ def showUsage():
    while i < len(sys.argv):
        myarg = sys.argv[i].lower().replace('_', '')
        if myarg == 'startdate':
            start_date = utils.get_yyyymmdd(sys.argv[i + 1],
                                            returnDateTime=True)
            i += 2
        elif myarg == 'enddate':
            end_date = utils.get_yyyymmdd(sys.argv[i + 1], returnDateTime=True)
            i += 2
        elif myarg == 'todrive':
            todrive = True
            i += 1
        elif myarg in ['fields', 'parameters']:
            parameters = sys.argv[i + 1].split(',')
            i += 2
        elif myarg == 'skipdates':
            for skip in sys.argv[i + 1].split(','):
                if skip.find(':') == -1:
                    skip_dates.add(utils.get_yyyymmdd(skip,
                                                      returnDateTime=True))
                else:
                    skip_start, skip_end = skip.split(':', 1)
                    skip_start = utils.get_yyyymmdd(skip_start,
                                                    returnDateTime=True)
                    skip_end = utils.get_yyyymmdd(skip_end, returnDateTime=True)
                    while skip_start <= skip_end:
                        skip_dates.add(skip_start)
                        skip_start += one_day
            i += 2
        elif myarg == 'skipdaysofweek':
            skipdaynames = sys.argv[i + 1].split(',')
            dow = [d.lower() for d in calendar.day_abbr]
            skip_day_numbers = [dow.index(d) for d in skipdaynames if d in dow]
            i += 2
        elif report == 'user' and myarg in ['orgunit', 'org', 'ou']:
            _, orgUnitId = gam.getOrgUnitId(sys.argv[i + 1])
            i += 2
        elif report == 'user' and myarg in usergroup_types:
            users = gam.getUsersToModify(myarg, sys.argv[i + 1])
            kwargs = [{'userKey': user} for user in users]
            i += 2
        else:
            controlflow.invalid_argument_exit(sys.argv[i],
                                              f'gam report usage {report}')
    if parameters:
        titles.extend(parameters)
        parameters = ','.join(parameters)
@ -206,7 +220,8 @@ def showUsage():
    try:
        for kwarg in kwargs:
            try:
                usage = gapi.get_all_pages(endpoint,
                                           'get',
                                           'usageReports',
                                           throw_reasons=throw_reasons,
                                           customerId=customerId,
@ -250,8 +265,7 @@ def showUsage():
        report_name = f'{report.capitalize()} Usage Report - {start_use_date}:{end_use_date}'
    else:
        report_name = f'{report.capitalize()} Usage Report - {start_date}:{end_date} - No Data'
    display.write_csv_file(csvRows, titles, report_name, todrive)


def showReport():
@ -260,17 +274,18 @@ def showReport():
    report = sys.argv[2].lower()
    report = REPORT_CHOICE_MAP.get(report.replace('_', ''), report)
    if report == 'usage':
        showUsage()
        return
    if report == 'usageparameters':
        showUsageParameters()
        return
    valid_apps = gapi.get_enum_values_minus_unspecified(
        rep._rootDesc['resources']['activities']['methods']['list']
        ['parameters']['applicationName']['enum']) + ['customer', 'user']
    if report not in valid_apps:
        controlflow.expected_argument_exit('report',
                                           ', '.join(sorted(valid_apps)),
                                           report)
    customerId = GC_Values[GC_CUSTOMER_ID]
    if customerId == MY_CUSTOMER:
        customerId = None
@ -283,51 +298,53 @@ def showReport():
    while i < len(sys.argv):
        myarg = sys.argv[i].lower()
        if myarg == 'date':
            tryDate = utils.get_yyyymmdd(sys.argv[i + 1])
            i += 2
        elif myarg in ['orgunit', 'org', 'ou']:
            _, orgUnitId = gam.getOrgUnitId(sys.argv[i + 1])
            i += 2
        elif myarg == 'fulldatarequired':
            fullDataRequired = []
            fdr = sys.argv[i + 1].lower()
            if fdr and fdr != 'all':
                fullDataRequired = fdr.replace(',', ' ').split()
            i += 2
        elif myarg == 'start':
            startTime = utils.get_time_or_delta_from_now(sys.argv[i + 1])
            i += 2
        elif myarg == 'end':
            endTime = utils.get_time_or_delta_from_now(sys.argv[i + 1])
            i += 2
        elif myarg == 'event':
            eventName = sys.argv[i + 1]
            i += 2
        elif myarg == 'user':
            userKey = gam.normalizeEmailAddressOrUID(sys.argv[i + 1])
            i += 2
        elif myarg in ['filter', 'filters']:
            filters = sys.argv[i + 1]
            i += 2
        elif myarg in ['fields', 'parameters']:
            parameters = sys.argv[i + 1]
            i += 2
        elif myarg == 'ip':
            actorIpAddress = sys.argv[i + 1]
            i += 2
        elif myarg == 'todrive':
            to_drive = True
            i += 1
        else:
            controlflow.invalid_argument_exit(sys.argv[i], 'gam report')
    if report == 'user':
        while True:
            try:
                if fullDataRequired is not None:
                    warnings = gapi.get_items(rep.userUsageReport(),
                                              'get',
                                              'warnings',
                                              throw_reasons=throw_reasons,
                                              date=tryDate,
                                              userKey=userKey,
                                              customerId=customerId,
                                              orgUnitID=orgUnitId,
                                              fields='warnings')
@ -339,11 +356,13 @@ def showReport():
                if fullData == 0:
                    continue
                page_message = gapi.got_total_items_msg('Users', '...\n')
                usage = gapi.get_all_pages(rep.userUsageReport(),
                                           'get',
                                           'usageReports',
                                           page_message=page_message,
                                           throw_reasons=throw_reasons,
                                           date=tryDate,
                                           userKey=userKey,
                                           customerId=customerId,
                                           orgUnitID=orgUnitId,
                                           filters=filters,
@ -359,8 +378,7 @@ def showReport():
        for user_report in usage:
            if 'entity' not in user_report:
                continue
            row = {'email': user_report['entity']['userEmail'], 'date': tryDate}
            for item in user_report.get('parameters', []):
                if 'name' not in item:
                    continue
@ -374,14 +392,15 @@ def showReport():
            else:
                row[name] = ''
            csvRows.append(row)
        display.write_csv_file(csvRows, titles, f'User Reports - {tryDate}',
                               to_drive)
    elif report == 'customer':
        while True:
            try:
                if fullDataRequired is not None:
                    warnings = gapi.get_items(rep.customerUsageReports(),
                                              'get',
                                              'warnings',
                                              throw_reasons=throw_reasons,
                                              customerId=customerId,
                                              date=tryDate,
@ -393,7 +412,8 @@ def showReport():
                    sys.exit(1)
                if fullData == 0:
                    continue
                usage = gapi.get_all_pages(rep.customerUsageReports(),
                                           'get',
                                           'usageReports',
                                           throw_reasons=throw_reasons,
                                           customerId=customerId,
@ -442,27 +462,32 @@ def showReport():
                    value = ' '.join(values)
                elif 'version_number' in subitem \
                        and 'num_devices' in subitem:
                    values.append(f'{subitem["version_number"]}:'
                                  f'{subitem["num_devices"]}')
                else:
                    continue
                value = ' '.join(sorted(values, reverse=True))
            csvRows.append({'name': name, 'value': value})
        for app in auth_apps:  # put apps at bottom
            csvRows.append(app)
        display.write_csv_file(csvRows,
                               titles,
                               f'Customer Report - {tryDate}',
                               todrive=to_drive)
    else:
        page_message = gapi.got_total_items_msg('Activities', '...\n')
        activities = gapi.get_all_pages(rep.activities(),
                                        'list',
                                        'items',
                                        page_message=page_message,
                                        applicationName=report,
                                        userKey=userKey,
                                        customerId=customerId,
                                        actorIpAddress=actorIpAddress,
                                        startTime=startTime,
                                        endTime=endTime,
                                        eventName=eventName,
                                        filters=filters,
                                        orgUnitID=orgUnitId)
        if activities:
            titles = ['name']
@ -495,10 +520,11 @@ def showReport():
                        parts = {}
                        for message in item['multiMessageValue']:
                            for mess in message['parameter']:
                                value = mess.get(
                                    'value',
                                    ' '.join(mess.get('multiValue', [])))
                                parts[mess['name']] = parts.get(
                                    mess['name'], []) + [value]
                        for part, v in parts.items():
                            if part == 'scope_name':
                                part = 'scope'
@ -513,15 +539,18 @@ def showReport():
                if item not in titles:
                    titles.append(item)
            csvRows.append(row)
        display.sort_csv_titles([
            'name',
        ], titles)
        display.write_csv_file(csvRows, titles,
                               f'{report.capitalize()} Activity Report',
                               to_drive)


def _adjust_date(errMsg):
    match_date = re.match(
        'Data for dates later than (.*) is not yet '
        'available. Please check back later', errMsg)
    if not match_date:
        match_date = re.match('Start date can not be later than (.*)', errMsg)
    if not match_date:

View File

@ -16,7 +16,10 @@ def build_gapi():
    return gam.buildGAPIObject('storage')


def get_cloud_storage_object(s,
                             bucket,
                             object_,
                             local_file=None,
                             expectedMd5=None):
    if not local_file:
        local_file = object_
@ -60,13 +63,19 @@ def download_bucket():
    s = build_gapi()
    page_message = gapi.got_total_items_msg('Files', '...')
    fields = 'nextPageToken,items(name,id,md5Hash)'
    objects = gapi.get_all_pages(s.objects(),
                                 'list',
                                 'items',
                                 page_message=page_message,
                                 bucket=bucket,
                                 projection='noAcl',
                                 fields=fields)
    i = 1
    for object_ in objects:
        print(f'{i}/{len(objects)}')
        expectedMd5 = base64.b64decode(object_['md5Hash']).hex()
        get_cloud_storage_object(s,
                                 bucket,
                                 object_['name'],
                                 expectedMd5=expectedMd5)
        i += 1

View File

@ -23,16 +23,17 @@ def validateCollaborators(collaboratorList, cd):
    for collaborator in collaboratorList.split(','):
        collaborator_id = gam.convertEmailAddressToUID(collaborator, cd)
        if not collaborator_id:
            controlflow.system_error_exit(
                4, f'failed to get a UID for '
                f'{collaborator}. Please make '
                f'sure this is a real user.')
        collaborators.append({'email': collaborator, 'id': collaborator_id})
    return collaborators


def createMatter():
    v = buildGAPIObject()
    matter_time = datetime.datetime.now().strftime('%Y-%m-%d %H:%M:%S')
    body = {'name': f'New Matter - {matter_time}'}
    collaborators = []
    cd = None
@@ -40,26 +41,29 @@ def createMatter():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'name':
-            body['name'] = sys.argv[i+1]
+            body['name'] = sys.argv[i + 1]
             i += 2
         elif myarg == 'description':
-            body['description'] = sys.argv[i+1]
+            body['description'] = sys.argv[i + 1]
             i += 2
         elif myarg in ['collaborator', 'collaborators']:
             if not cd:
                 cd = gam.buildGAPIObject('directory')
-            collaborators.extend(validateCollaborators(sys.argv[i+1], cd))
+            collaborators.extend(validateCollaborators(sys.argv[i + 1], cd))
             i += 2
         else:
-            controlflow.invalid_argument_exit(sys.argv[i], "gam create matter")
+            controlflow.invalid_argument_exit(sys.argv[i], 'gam create matter')
     matterId = gapi.call(v.matters(), 'create', body=body,
                          fields='matterId')['matterId']
     print(f'Created matter {matterId}')
     for collaborator in collaborators:
         print(f' adding collaborator {collaborator["email"]}')
-        body = {'matterPermission': {
-            'role': 'COLLABORATOR',
-            'accountId': collaborator['id']}}
+        body = {
+            'matterPermission': {
+                'role': 'COLLABORATOR',
+                'accountId': collaborator['id']
+            }
+        }
         gapi.call(v.matters(), 'addPermissions', matterId=matterId, body=body)
@@ -77,8 +81,9 @@ VAULT_SEARCH_METHODS_MAP = {
     'teamdrive': 'SHARED_DRIVE',
     'teamdrives': 'SHARED_DRIVE',
 }
-VAULT_SEARCH_METHODS_LIST = ['accounts',
-                             'orgunit', 'shareddrives', 'rooms', 'everyone']
+VAULT_SEARCH_METHODS_LIST = [
+    'accounts', 'orgunit', 'shareddrives', 'rooms', 'everyone'
+]


 def createExport():
@@ -98,17 +103,18 @@ def createExport():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'matter':
-            matterId = getMatterItem(v, sys.argv[i+1])
+            matterId = getMatterItem(v, sys.argv[i + 1])
             body['matterId'] = matterId
             i += 2
         elif myarg == 'name':
-            body['name'] = sys.argv[i+1]
+            body['name'] = sys.argv[i + 1]
             i += 2
         elif myarg == 'corpus':
-            body['query']['corpus'] = sys.argv[i+1].upper()
+            body['query']['corpus'] = sys.argv[i + 1].upper()
             if body['query']['corpus'] not in allowed_corpuses:
-                controlflow.expected_argument_exit(
-                    "corpus", ", ".join(allowed_corpuses), sys.argv[i+1])
+                controlflow.expected_argument_exit('corpus',
+                                                   ', '.join(allowed_corpuses),
+                                                   sys.argv[i + 1])
             i += 2
         elif myarg in VAULT_SEARCH_METHODS_MAP:
             if body['query'].get('searchMethod'):
@@ -120,82 +126,93 @@ def createExport():
             body['query']['searchMethod'] = searchMethod
             if searchMethod == 'ACCOUNT':
                 body['query']['accountInfo'] = {
-                    'emails': sys.argv[i+1].split(',')}
+                    'emails': sys.argv[i + 1].split(',')
+                }
                 i += 2
             elif searchMethod == 'ORG_UNIT':
                 body['query']['orgUnitInfo'] = {
-                    'orgUnitId': gam.getOrgUnitId(sys.argv[i+1])[1]}
+                    'orgUnitId': gam.getOrgUnitId(sys.argv[i + 1])[1]
+                }
                 i += 2
             elif searchMethod == 'SHARED_DRIVE':
                 body['query']['sharedDriveInfo'] = {
-                    'sharedDriveIds': sys.argv[i+1].split(',')}
+                    'sharedDriveIds': sys.argv[i + 1].split(',')
+                }
                 i += 2
             elif searchMethod == 'ROOM':
                 body['query']['hangoutsChatInfo'] = {
-                    'roomId': sys.argv[i+1].split(',')}
+                    'roomId': sys.argv[i + 1].split(',')
+                }
                 i += 2
             else:
                 i += 1
         elif myarg == 'scope':
-            body['query']['dataScope'] = sys.argv[i+1].upper()
+            body['query']['dataScope'] = sys.argv[i + 1].upper()
             if body['query']['dataScope'] not in allowed_scopes:
-                controlflow.expected_argument_exit(
-                    "scope", ", ".join(allowed_scopes), sys.argv[i+1])
+                controlflow.expected_argument_exit('scope',
+                                                   ', '.join(allowed_scopes),
+                                                   sys.argv[i + 1])
             i += 2
         elif myarg in ['terms']:
-            body['query']['terms'] = sys.argv[i+1]
+            body['query']['terms'] = sys.argv[i + 1]
             i += 2
         elif myarg in ['start', 'starttime']:
             body['query']['startTime'] = utils.get_date_zero_time_or_full_time(
-                sys.argv[i+1])
+                sys.argv[i + 1])
             i += 2
         elif myarg in ['end', 'endtime']:
             body['query']['endTime'] = utils.get_date_zero_time_or_full_time(
-                sys.argv[i+1])
+                sys.argv[i + 1])
             i += 2
         elif myarg in ['timezone']:
-            body['query']['timeZone'] = sys.argv[i+1]
+            body['query']['timeZone'] = sys.argv[i + 1]
             i += 2
         elif myarg in ['excludedrafts']:
             body['query']['mailOptions'] = {
-                'excludeDrafts': gam.getBoolean(sys.argv[i+1], myarg)}
+                'excludeDrafts': gam.getBoolean(sys.argv[i + 1], myarg)
+            }
             i += 2
         elif myarg in ['driveversiondate']:
             body['query'].setdefault('driveOptions', {})['versionDate'] = \
                 utils.get_date_zero_time_or_full_time(sys.argv[i+1])
             i += 2
         elif myarg in ['includeshareddrives', 'includeteamdrives']:
-            body['query'].setdefault('driveOptions', {})[
-                'includeSharedDrives'] = gam.getBoolean(sys.argv[i+1], myarg)
+            body['query'].setdefault(
+                'driveOptions', {})['includeSharedDrives'] = gam.getBoolean(
+                    sys.argv[i + 1], myarg)
             i += 2
         elif myarg in ['includerooms']:
             body['query']['hangoutsChatOptions'] = {
-                'includeRooms': gam.getBoolean(sys.argv[i+1], myarg)}
+                'includeRooms': gam.getBoolean(sys.argv[i + 1], myarg)
+            }
             i += 2
         elif myarg in ['format']:
-            export_format = sys.argv[i+1].upper()
+            export_format = sys.argv[i + 1].upper()
             if export_format not in allowed_formats:
-                controlflow.expected_argument_exit(
-                    "export format", ", ".join(allowed_formats), export_format)
+                controlflow.expected_argument_exit('export format',
+                                                   ', '.join(allowed_formats),
+                                                   export_format)
             i += 2
         elif myarg in ['showconfidentialmodecontent']:
-            showConfidentialModeContent = gam.getBoolean(sys.argv[i+1], myarg)
+            showConfidentialModeContent = gam.getBoolean(sys.argv[i + 1], myarg)
             i += 2
         elif myarg in ['region']:
             allowed_regions = gapi.get_enum_values_minus_unspecified(
-                v._rootDesc['schemas']['ExportOptions']['properties'][
-                    'region']['enum'])
-            body['exportOptions']['region'] = sys.argv[i+1].upper()
+                v._rootDesc['schemas']['ExportOptions']['properties']['region']
+                ['enum'])
+            body['exportOptions']['region'] = sys.argv[i + 1].upper()
             if body['exportOptions']['region'] not in allowed_regions:
-                controlflow.expected_argument_exit("region", ", ".join(
-                    allowed_regions), body['exportOptions']['region'])
+                controlflow.expected_argument_exit(
+                    'region', ', '.join(allowed_regions),
+                    body['exportOptions']['region'])
             i += 2
         elif myarg in ['includeaccessinfo']:
-            body['exportOptions'].setdefault('driveOptions', {})[
-                'includeAccessInfo'] = gam.getBoolean(sys.argv[i+1], myarg)
+            body['exportOptions'].setdefault(
+                'driveOptions', {})['includeAccessInfo'] = gam.getBoolean(
+                    sys.argv[i + 1], myarg)
             i += 2
         else:
-            controlflow.invalid_argument_exit(sys.argv[i], "gam create export")
+            controlflow.invalid_argument_exit(sys.argv[i], 'gam create export')
     if not matterId:
         controlflow.system_error_exit(
             3, 'you must specify a matter for the new export.')
@@ -207,7 +224,7 @@ def createExport():
                                       'for the new export. Choose one of ' \
                                       f'{", ".join(VAULT_SEARCH_METHODS_LIST)}')
     if 'name' not in body:
-        corpus_name = body["query"]["corpus"]
+        corpus_name = body['query']['corpus']
         corpus_date = datetime.datetime.now()
         body['name'] = f'GAM {corpus_name} export - {corpus_date}'
     options_field = None
@@ -223,8 +240,10 @@ def createExport():
     if showConfidentialModeContent is not None:
         body['exportOptions'][options_field][
             'showConfidentialModeContent'] = showConfidentialModeContent
-    results = gapi.call(v.matters().exports(), 'create',
-                        matterId=matterId, body=body)
+    results = gapi.call(v.matters().exports(),
+                        'create',
+                        matterId=matterId,
+                        body=body)
     print(f'Created export {results["id"]}')
     display.print_json(results)
@@ -234,16 +253,20 @@ def deleteExport():
     matterId = getMatterItem(v, sys.argv[3])
     exportId = convertExportNameToID(v, sys.argv[4], matterId)
     print(f'Deleting export {sys.argv[4]} / {exportId}')
-    gapi.call(v.matters().exports(), 'delete',
-              matterId=matterId, exportId=exportId)
+    gapi.call(v.matters().exports(),
+              'delete',
+              matterId=matterId,
+              exportId=exportId)


 def getExportInfo():
     v = buildGAPIObject()
     matterId = getMatterItem(v, sys.argv[3])
     exportId = convertExportNameToID(v, sys.argv[4], matterId)
-    export = gapi.call(v.matters().exports(), 'get',
-                       matterId=matterId, exportId=exportId)
+    export = gapi.call(v.matters().exports(),
+                       'get',
+                       matterId=matterId,
+                       exportId=exportId)
     display.print_json(export)
@@ -261,35 +284,37 @@ def createHold():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'name':
-            body['name'] = sys.argv[i+1]
+            body['name'] = sys.argv[i + 1]
             i += 2
         elif myarg == 'query':
-            query = sys.argv[i+1]
+            query = sys.argv[i + 1]
             i += 2
         elif myarg == 'corpus':
-            body['corpus'] = sys.argv[i+1].upper()
+            body['corpus'] = sys.argv[i + 1].upper()
             if body['corpus'] not in allowed_corpuses:
-                controlflow.expected_argument_exit(
-                    "corpus", ", ".join(allowed_corpuses), sys.argv[i+1])
+                controlflow.expected_argument_exit('corpus',
+                                                   ', '.join(allowed_corpuses),
+                                                   sys.argv[i + 1])
             i += 2
         elif myarg in ['accounts', 'users', 'groups']:
-            accounts = sys.argv[i+1].split(',')
+            accounts = sys.argv[i + 1].split(',')
             i += 2
         elif myarg in ['orgunit', 'ou']:
             body['orgUnit'] = {
-                'orgUnitId': gam.getOrgUnitId(sys.argv[i+1])[1]}
+                'orgUnitId': gam.getOrgUnitId(sys.argv[i + 1])[1]
+            }
             i += 2
         elif myarg in ['start', 'starttime']:
-            start_time = utils.get_date_zero_time_or_full_time(sys.argv[i+1])
+            start_time = utils.get_date_zero_time_or_full_time(sys.argv[i + 1])
             i += 2
         elif myarg in ['end', 'endtime']:
-            end_time = utils.get_date_zero_time_or_full_time(sys.argv[i+1])
+            end_time = utils.get_date_zero_time_or_full_time(sys.argv[i + 1])
             i += 2
         elif myarg == 'matter':
-            matterId = getMatterItem(v, sys.argv[i+1])
+            matterId = getMatterItem(v, sys.argv[i + 1])
             i += 2
         else:
-            controlflow.invalid_argument_exit(sys.argv[i], "gam create hold")
+            controlflow.invalid_argument_exit(sys.argv[i], 'gam create hold')
     if not matterId:
         controlflow.system_error_exit(
             3, 'you must specify a matter for the new hold.')
@@ -322,13 +347,15 @@ def createHold():
         cd = gam.buildGAPIObject('directory')
         account_type = 'group' if body['corpus'] == 'GROUPS' else 'user'
         for account in accounts:
-            body['accounts'].append(
-                {'accountId': gam.convertEmailAddressToUID(account,
-                                                           cd,
-                                                           account_type)}
-            )
-    holdId = gapi.call(v.matters().holds(), 'create',
-                       matterId=matterId, body=body, fields='holdId')
+            body['accounts'].append({
+                'accountId':
+                    gam.convertEmailAddressToUID(account, cd, account_type)
+            })
+    holdId = gapi.call(v.matters().holds(),
+                       'create',
+                       matterId=matterId,
+                       body=body,
+                       fields='holdId')
     print(f'Created hold {holdId["holdId"]}')
@@ -340,11 +367,11 @@ def deleteHold():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'matter':
-            matterId = getMatterItem(v, sys.argv[i+1])
+            matterId = getMatterItem(v, sys.argv[i + 1])
             holdId = convertHoldNameToID(v, hold, matterId)
             i += 2
         else:
-            controlflow.invalid_argument_exit(myarg, "gam delete hold")
+            controlflow.invalid_argument_exit(myarg, 'gam delete hold')
     if not matterId:
         controlflow.system_error_exit(
             3, 'you must specify a matter for the hold.')
@@ -360,23 +387,24 @@ def getHoldInfo():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'matter':
-            matterId = getMatterItem(v, sys.argv[i+1])
+            matterId = getMatterItem(v, sys.argv[i + 1])
            holdId = convertHoldNameToID(v, hold, matterId)
             i += 2
         else:
-            controlflow.invalid_argument_exit(myarg, "gam info hold")
+            controlflow.invalid_argument_exit(myarg, 'gam info hold')
     if not matterId:
         controlflow.system_error_exit(
             3, 'you must specify a matter for the hold.')
-    results = gapi.call(v.matters().holds(), 'get',
-                        matterId=matterId, holdId=holdId)
+    results = gapi.call(v.matters().holds(),
+                        'get',
+                        matterId=matterId,
+                        holdId=holdId)
     cd = gam.buildGAPIObject('directory')
     if 'accounts' in results:
         account_type = 'group' if results['corpus'] == 'GROUPS' else 'user'
         for i in range(0, len(results['accounts'])):
             uid = f'uid:{results["accounts"][i]["accountId"]}'
-            acct_email = gam.convertUIDtoEmailAddress(
-                uid, cd, [account_type])
+            acct_email = gam.convertUIDtoEmailAddress(uid, cd, [account_type])
             results['accounts'][i]['email'] = acct_email
     if 'orgUnit' in results:
         results['orgUnit']['orgUnitPath'] = gam.doGetOrgInfo(
@@ -390,13 +418,17 @@ def convertExportNameToID(v, nameOrID, matterId):
     if cg:
         return cg.group(1)
     fields = 'exports(id,name),nextPageToken'
-    exports = gapi.get_all_pages(v.matters().exports(
-    ), 'list', 'exports', matterId=matterId, fields=fields)
+    exports = gapi.get_all_pages(v.matters().exports(),
+                                 'list',
+                                 'exports',
+                                 matterId=matterId,
+                                 fields=fields)
     for export in exports:
         if export['name'].lower() == nameOrID:
             return export['id']
-    controlflow.system_error_exit(4, f'could not find export name {nameOrID} '
-                                  f'in matter {matterId}')
+    controlflow.system_error_exit(
+        4, f'could not find export name {nameOrID} '
+        f'in matter {matterId}')


 def convertHoldNameToID(v, nameOrID, matterId):
@@ -405,13 +437,17 @@ def convertHoldNameToID(v, nameOrID, matterId):
     if cg:
         return cg.group(1)
     fields = 'holds(holdId,name),nextPageToken'
-    holds = gapi.get_all_pages(v.matters().holds(
-    ), 'list', 'holds', matterId=matterId, fields=fields)
+    holds = gapi.get_all_pages(v.matters().holds(),
+                               'list',
+                               'holds',
+                               matterId=matterId,
+                               fields=fields)
     for hold in holds:
         if hold['name'].lower() == nameOrID:
             return hold['holdId']
-    controlflow.system_error_exit(4, f'could not find hold name {nameOrID} '
-                                  f'in matter {matterId}')
+    controlflow.system_error_exit(
+        4, f'could not find hold name {nameOrID} '
+        f'in matter {matterId}')


 def convertMatterNameToID(v, nameOrID):
@@ -420,8 +456,11 @@ def convertMatterNameToID(v, nameOrID):
     if cg:
         return cg.group(1)
     fields = 'matters(matterId,name),nextPageToken'
-    matters = gapi.get_all_pages(v.matters(
-    ), 'list', 'matters', view='BASIC', fields=fields)
+    matters = gapi.get_all_pages(v.matters(),
+                                 'list',
+                                 'matters',
+                                 view='BASIC',
+                                 fields=fields)
     for matter in matters:
         if matter['name'].lower() == nameOrID:
             return matter['matterId']
@@ -449,36 +488,41 @@ def updateHold():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'matter':
-            matterId = getMatterItem(v, sys.argv[i+1])
+            matterId = getMatterItem(v, sys.argv[i + 1])
             holdId = convertHoldNameToID(v, hold, matterId)
             i += 2
         elif myarg == 'query':
-            query = sys.argv[i+1]
+            query = sys.argv[i + 1]
             i += 2
         elif myarg in ['orgunit', 'ou']:
-            body['orgUnit'] = {'orgUnitId': gam.getOrgUnitId(sys.argv[i+1])[1]}
+            body['orgUnit'] = {
+                'orgUnitId': gam.getOrgUnitId(sys.argv[i + 1])[1]
+            }
             i += 2
         elif myarg in ['start', 'starttime']:
-            start_time = utils.get_date_zero_time_or_full_time(sys.argv[i+1])
+            start_time = utils.get_date_zero_time_or_full_time(sys.argv[i + 1])
             i += 2
         elif myarg in ['end', 'endtime']:
-            end_time = utils.get_date_zero_time_or_full_time(sys.argv[i+1])
+            end_time = utils.get_date_zero_time_or_full_time(sys.argv[i + 1])
             i += 2
         elif myarg in ['addusers', 'addaccounts', 'addgroups']:
-            add_accounts = sys.argv[i+1].split(',')
+            add_accounts = sys.argv[i + 1].split(',')
             i += 2
         elif myarg in ['removeusers', 'removeaccounts', 'removegroups']:
-            del_accounts = sys.argv[i+1].split(',')
+            del_accounts = sys.argv[i + 1].split(',')
             i += 2
         else:
-            controlflow.invalid_argument_exit(myarg, "gam update hold")
+            controlflow.invalid_argument_exit(myarg, 'gam update hold')
     if not matterId:
         controlflow.system_error_exit(
             3, 'you must specify a matter for the hold.')
     if query or start_time or end_time or body.get('orgUnit'):
         fields = 'corpus,query,orgUnit'
-        old_body = gapi.call(v.matters().holds(
-        ), 'get', matterId=matterId, holdId=holdId, fields=fields)
+        old_body = gapi.call(v.matters().holds(),
+                             'get',
+                             matterId=matterId,
+                             holdId=holdId,
+                             fields=fields)
         body['query'] = old_body['query']
         body['corpus'] = old_body['corpus']
         if 'orgUnit' in old_body and 'orgUnit' not in body:
@@ -502,20 +546,29 @@ def updateHold():
             body['query'][query_type]['endTime'] = end_time
     if body:
         print(f'Updating hold {hold} / {holdId}')
-        gapi.call(v.matters().holds(), 'update',
-                  matterId=matterId, holdId=holdId, body=body)
+        gapi.call(v.matters().holds(),
+                  'update',
+                  matterId=matterId,
+                  holdId=holdId,
+                  body=body)
     if add_accounts or del_accounts:
         cd = gam.buildGAPIObject('directory')
         for account in add_accounts:
             print(f'adding {account} to hold.')
             add_body = {'accountId': gam.convertEmailAddressToUID(account, cd)}
-            gapi.call(v.matters().holds().accounts(), 'create',
-                      matterId=matterId, holdId=holdId, body=add_body)
+            gapi.call(v.matters().holds().accounts(),
+                      'create',
+                      matterId=matterId,
+                      holdId=holdId,
+                      body=add_body)
         for account in del_accounts:
             print(f'removing {account} from hold.')
             accountId = gam.convertEmailAddressToUID(account, cd)
-            gapi.call(v.matters().holds().accounts(), 'delete',
-                      matterId=matterId, holdId=holdId, accountId=accountId)
+            gapi.call(v.matters().holds().accounts(),
+                      'delete',
+                      matterId=matterId,
+                      holdId=holdId,
+                      accountId=accountId)


 def updateMatter(action=None):
@@ -530,30 +583,30 @@ def updateMatter(action=None):
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'action':
-            action = sys.argv[i+1].lower()
+            action = sys.argv[i + 1].lower()
             if action not in VAULT_MATTER_ACTIONS:
                 controlflow.system_error_exit(3, f'allowed actions are ' \
                     f'{", ".join(VAULT_MATTER_ACTIONS)}, got {action}')
             i += 2
         elif myarg == 'name':
-            body['name'] = sys.argv[i+1]
+            body['name'] = sys.argv[i + 1]
             i += 2
         elif myarg == 'description':
-            body['description'] = sys.argv[i+1]
+            body['description'] = sys.argv[i + 1]
             i += 2
         elif myarg in ['addcollaborator', 'addcollaborators']:
             if not cd:
                 cd = gam.buildGAPIObject('directory')
-            add_collaborators.extend(validateCollaborators(sys.argv[i+1], cd))
+            add_collaborators.extend(validateCollaborators(sys.argv[i + 1], cd))
             i += 2
         elif myarg in ['removecollaborator', 'removecollaborators']:
             if not cd:
                 cd = gam.buildGAPIObject('directory')
             remove_collaborators.extend(
-                validateCollaborators(sys.argv[i+1], cd))
+                validateCollaborators(sys.argv[i + 1], cd))
             i += 2
         else:
-            controlflow.invalid_argument_exit(sys.argv[i], "gam update matter")
+            controlflow.invalid_argument_exit(sys.argv[i], 'gam update matter')
     if action == 'delete':
         action_kwargs = {}
     if body:
@@ -561,8 +614,10 @@ def updateMatter(action=None):
         if 'name' not in body or 'description' not in body:
             # bah, API requires name/description to be sent
             # on update even when it's not changing
-            result = gapi.call(v.matters(), 'get',
-                               matterId=matterId, view='BASIC')
+            result = gapi.call(v.matters(),
+                               'get',
+                               matterId=matterId,
+                               view='BASIC')
             body.setdefault('name', result['name'])
             body.setdefault('description', result.get('description'))
         gapi.call(v.matters(), 'update', body=body, matterId=matterId)
@@ -571,12 +626,18 @@ def updateMatter(action=None):
         gapi.call(v.matters(), action, matterId=matterId, **action_kwargs)
     for collaborator in add_collaborators:
         print(f' adding collaborator {collaborator["email"]}')
-        body = {'matterPermission': {'role': 'COLLABORATOR',
-                                     'accountId': collaborator['id']}}
+        body = {
+            'matterPermission': {
+                'role': 'COLLABORATOR',
+                'accountId': collaborator['id']
+            }
+        }
         gapi.call(v.matters(), 'addPermissions', matterId=matterId, body=body)
     for collaborator in remove_collaborators:
         print(f' removing collaborator {collaborator["email"]}')
-        gapi.call(v.matters(), 'removePermissions', matterId=matterId,
+        gapi.call(v.matters(),
+                  'removePermissions',
+                  matterId=matterId,
                   body={'accountId': collaborator['id']})
@@ -605,7 +666,7 @@ def downloadExport():
     while i < len(sys.argv):
         myarg = sys.argv[i].lower().replace('_', '')
         if myarg == 'targetfolder':
-            targetFolder = os.path.expanduser(sys.argv[i+1])
+            targetFolder = os.path.expanduser(sys.argv[i + 1])
             if not os.path.isdir(targetFolder):
                 os.makedirs(targetFolder)
             i += 2
@@ -616,10 +677,12 @@ def downloadExport():
             extractFiles = False
             i += 1
         else:
-            controlflow.invalid_argument_exit(
-                sys.argv[i], "gam download export")
-    export = gapi.call(v.matters().exports(), 'get',
-                       matterId=matterId, exportId=exportId)
+            controlflow.invalid_argument_exit(sys.argv[i],
+                                              'gam download export')
+    export = gapi.call(v.matters().exports(),
+                       'get',
+                       matterId=matterId,
+                       exportId=exportId)
     for s_file in export['cloudStorageSink']['files']:
         bucket = s_file['bucketName']
         s_object = s_file['objectName']
@@ -631,8 +694,8 @@ def downloadExport():
             done = False
             while not done:
                 status, done = downloader.next_chunk()
-                sys.stdout.write(
-                    ' Downloaded: {0:>7.2%}\r'.format(status.progress()))
+                sys.stdout.write(' Downloaded: {0:>7.2%}\r'.format(
+                    status.progress()))
                 sys.stdout.flush()
             sys.stdout.write('\n Download complete. Flushing to disk...\n')
             fileutils.close_file(f, True)
@@ -665,23 +728,26 @@ def printMatters():
             i += 1
         elif myarg == 'matterstate':
             valid_states = gapi.get_enum_values_minus_unspecified(
-                v._rootDesc['schemas']['Matter']['properties']['state'][
-                    'enum'])
-            state = sys.argv[i+1].upper()
+                v._rootDesc['schemas']['Matter']['properties']['state']['enum'])
+            state = sys.argv[i + 1].upper()
             if state not in valid_states:
-                controlflow.expected_argument_exit(
-                    'state', ', '.join(valid_states), state)
+                controlflow.expected_argument_exit('state',
+                                                   ', '.join(valid_states),
+                                                   state)
             i += 2
         else:
-            controlflow.invalid_argument_exit(myarg, "gam print matters")
+            controlflow.invalid_argument_exit(myarg, 'gam print matters')
     gam.printGettingAllItems('Vault Matters', None)
     page_message = gapi.got_total_items_msg('Vault Matters', '...\n')
-    matters = gapi.get_all_pages(
-        v.matters(), 'list', 'matters', page_message=page_message, view=view,
-        state=state)
+    matters = gapi.get_all_pages(v.matters(),
+                                 'list',
+                                 'matters',
+                                 page_message=page_message,
+                                 view=view,
+                                 state=state)
     for matter in matters:
-        display.add_row_titles_to_csv_file(
-            utils.flatten_json(matter), csvRows, titles)
+        display.add_row_titles_to_csv_file(utils.flatten_json(matter), csvRows,
+                                           titles)
     display.sort_csv_titles(initialTitles, titles)
     display.write_csv_file(csvRows, titles, 'Vault Matters', todrive)
@@ -701,14 +767,18 @@ def printExports():
             todrive = True
             i += 1
         elif myarg in ['matter', 'matters']:
-            matters = sys.argv[i+1].split(',')
+            matters = sys.argv[i + 1].split(',')
             i += 2
         else:
-            controlflow.invalid_argument_exit(myarg, "gam print exports")
+            controlflow.invalid_argument_exit(myarg, 'gam print exports')
     if not matters:
         fields = 'matters(matterId),nextPageToken'
-        matters_results = gapi.get_all_pages(v.matters(
-        ), 'list', 'matters', view='BASIC', state='OPEN', fields=fields)
+        matters_results = gapi.get_all_pages(v.matters(),
+                                             'list',
+                                             'matters',
+                                             view='BASIC',
+                                             state='OPEN',
+                                             fields=fields)
         for matter in matters_results:
             matterIds.append(matter['matterId'])
     else:
@@ -716,11 +786,14 @@ def printExports():
         matterIds.append(getMatterItem(v, matter))
     for matterId in matterIds:
         sys.stderr.write(f'Retrieving exports for matter {matterId}\n')
-        exports = gapi.get_all_pages(
-            v.matters().exports(), 'list', 'exports', matterId=matterId)
+        exports = gapi.get_all_pages(v.matters().exports(),
+                                     'list',
+                                     'exports',
+                                     matterId=matterId)
         for export in exports:
-            display.add_row_titles_to_csv_file(utils.flatten_json(
-                export, flattened={'matterId': matterId}), csvRows, titles)
+            display.add_row_titles_to_csv_file(
+                utils.flatten_json(export, flattened={'matterId': matterId}),
+                csvRows, titles)
     display.sort_csv_titles(initialTitles, titles)
     display.write_csv_file(csvRows, titles, 'Vault Exports', todrive)
@@ -740,14 +813,18 @@ def printHolds():
            todrive = True
            i += 1
        elif myarg in ['matter', 'matters']:
            matters = sys.argv[i + 1].split(',')
            i += 2
        else:
            controlflow.invalid_argument_exit(myarg, 'gam print holds')
    if not matters:
        fields = 'matters(matterId),nextPageToken'
        matters_results = gapi.get_all_pages(v.matters(),
                                             'list',
                                             'matters',
                                             view='BASIC',
                                             state='OPEN',
                                             fields=fields)
        for matter in matters_results:
            matterIds.append(matter['matterId'])
    else:
@@ -755,10 +832,13 @@ def printHolds():
            matterIds.append(getMatterItem(v, matter))
    for matterId in matterIds:
        sys.stderr.write(f'Retrieving holds for matter {matterId}\n')
        holds = gapi.get_all_pages(v.matters().holds(),
                                   'list',
                                   'holds',
                                   matterId=matterId)
        for hold in holds:
            display.add_row_titles_to_csv_file(
                utils.flatten_json(hold, flattened={'matterId': matterId}),
                csvRows, titles)
    display.sort_csv_titles(initialTitles, titles)
    display.write_csv_file(csvRows, titles, 'Vault Holds', todrive)
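The `printExports`/`printHolds` hunks above both reflow GAM's usual two-token argument loop (`matters` consumes the keyword plus its comma-separated value). A minimal standalone sketch of that pattern, with illustrative names only:

```python
def parse_print_args(argv):
    """Parse a `todrive` / `matters <a,b,...>` style argument tail.

    Sketch of the GAM CLI loop: one-token flags advance i by 1,
    keyword+value pairs advance i by 2.
    """
    todrive = False
    matters = []
    i = 0
    while i < len(argv):
        myarg = argv[i].lower()
        if myarg == 'todrive':
            todrive = True
            i += 1
        elif myarg in ['matter', 'matters']:
            # Consume the keyword and its value: two tokens at once.
            matters = argv[i + 1].split(',')
            i += 2
        else:
            raise SystemExit(f'unknown argument: {myarg}')
    return todrive, matters
```

The `i += 2` step is what lets the loop treat `matters a,b` as a single logical option.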


@@ -14,7 +14,7 @@ def create_http(cache=None,
                timeout=None,
                override_min_tls=None,
                override_max_tls=None):
    """Creates a uniform HTTP transport object.

    Args:
        cache: The HTTP cache to use.
@@ -27,22 +27,21 @@ def create_http(cache=None,
    Returns:
        httplib2.Http with the specified options.
    """
    tls_minimum_version = override_min_tls if override_min_tls else GC_Values.get(
        GC_TLS_MIN_VERSION)
    tls_maximum_version = override_max_tls if override_max_tls else GC_Values.get(
        GC_TLS_MAX_VERSION)
    httpObj = httplib2.Http(ca_certs=GC_Values.get(GC_CA_FILE),
                            tls_maximum_version=tls_maximum_version,
                            tls_minimum_version=tls_minimum_version,
                            cache=cache,
                            timeout=timeout)
    httpObj.redirect_codes = set(httpObj.redirect_codes) - {308}
    return httpObj
def create_request(http=None):
    """Creates a uniform Request object with a default http, if not provided.

    Args:
        http: Optional httplib2.Http compatible object to be used with the request.
@@ -51,53 +50,53 @@ def create_request(http=None):
    Returns:
        Request: A google_auth_httplib2.Request compatible Request.
    """
    if not http:
        http = create_http()
    return Request(http)
GAM_USER_AGENT = GAM_INFO


def _force_user_agent(user_agent):
    """Creates a decorator which can force a user agent in HTTP headers."""

    def decorator(request_method):
        """Wraps a request method to insert a user-agent in HTTP headers."""

        def wrapped_request_method(*args, **kwargs):
            """Modifies HTTP headers to include a specified user-agent."""
            if kwargs.get('headers') is not None:
                if kwargs['headers'].get('user-agent'):
                    if user_agent not in kwargs['headers']['user-agent']:
                        # Save the existing user-agent header and tack on our own.
                        kwargs['headers']['user-agent'] = (
                            f'{user_agent} '
                            f'{kwargs["headers"]["user-agent"]}')
                else:
                    kwargs['headers'] = {'user-agent': user_agent}
            else:
                kwargs['headers'] = {'user-agent': user_agent}
            return request_method(*args, **kwargs)

        return wrapped_request_method

    return decorator
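The `_force_user_agent` decorator above can be exercised without any HTTP stack at all. This standalone sketch (with a stand-in `fake_request` function and an illustrative agent string) shows the three header cases it handles: no headers, headers without an agent, and an agent already present:

```python
def force_user_agent(user_agent):
    """Returns a decorator that injects `user_agent` into kwargs['headers']."""

    def decorator(request_method):

        def wrapped(*args, **kwargs):
            headers = kwargs.get('headers')
            if headers is not None:
                if headers.get('user-agent'):
                    if user_agent not in headers['user-agent']:
                        # Prepend our agent to the existing one.
                        headers['user-agent'] = (
                            f'{user_agent} {headers["user-agent"]}')
                else:
                    headers['user-agent'] = user_agent
            else:
                kwargs['headers'] = {'user-agent': user_agent}
            return request_method(*args, **kwargs)

        return wrapped

    return decorator


@force_user_agent('GAM-sketch/1.0')
def fake_request(uri, headers=None):
    # Echo the headers so callers can inspect what was actually sent.
    return headers
```

Because the decorator mutates `kwargs` before delegating, it works identically whether it wraps a free function, `Request.__call__`, or `AuthorizedHttp.request`.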
class Request(google_auth_httplib2.Request):
    """A Request which forces a user agent."""

    @_force_user_agent(GAM_USER_AGENT)
    def __call__(self, *args, **kwargs):
        """Inserts the GAM user-agent header in requests."""
        return super(Request, self).__call__(*args, **kwargs)


class AuthorizedHttp(google_auth_httplib2.AuthorizedHttp):
    """An AuthorizedHttp which forces a user agent during requests."""

    @_force_user_agent(GAM_USER_AGENT)
    def request(self, *args, **kwargs):
        """Inserts the GAM user-agent header in requests."""
        return super(AuthorizedHttp, self).request(*args, **kwargs)


@@ -13,167 +13,173 @@ from gam import transport
class CreateHttpTest(unittest.TestCase):

    def setUp(self):
        SetGlobalVariables()
        super(CreateHttpTest, self).setUp()

    def test_create_http_sets_default_values_on_http(self):
        http = transport.create_http()
        self.assertIsNone(http.cache)
        self.assertIsNone(http.timeout)
        self.assertEqual(http.tls_minimum_version,
                         transport.GC_Values[transport.GC_TLS_MIN_VERSION])
        self.assertEqual(http.tls_maximum_version,
                         transport.GC_Values[transport.GC_TLS_MAX_VERSION])
        self.assertEqual(http.ca_certs,
                         transport.GC_Values[transport.GC_CA_FILE])

    def test_create_http_sets_tls_min_version(self):
        http = transport.create_http(override_min_tls='TLSv1_1')
        self.assertEqual(http.tls_minimum_version, 'TLSv1_1')

    def test_create_http_sets_tls_max_version(self):
        http = transport.create_http(override_max_tls='TLSv1_3')
        self.assertEqual(http.tls_maximum_version, 'TLSv1_3')

    def test_create_http_sets_cache(self):
        fake_cache = {}
        http = transport.create_http(cache=fake_cache)
        self.assertEqual(http.cache, fake_cache)

    def test_create_http_sets_cache_timeout(self):
        http = transport.create_http(timeout=1234)
        self.assertEqual(http.timeout, 1234)
class TransportTest(unittest.TestCase):

    def setUp(self):
        self.mock_http = MagicMock(spec=httplib2.Http)
        self.mock_response = MagicMock(spec=httplib2.Response)
        self.mock_content = MagicMock()
        self.mock_http.request.return_value = (self.mock_response,
                                               self.mock_content)
        self.mock_credentials = MagicMock()
        self.test_uri = 'http://example.com'
        super(TransportTest, self).setUp()

    @patch.object(transport, 'create_http')
    def test_create_request_uses_default_http(self, mock_create_http):
        request = transport.create_request()
        self.assertEqual(request.http, mock_create_http.return_value)

    def test_create_request_uses_provided_http(self):
        request = transport.create_request(http=self.mock_http)
        self.assertEqual(request.http, self.mock_http)

    def test_create_request_returns_request_with_forced_user_agent(self):
        request = transport.create_request()
        self.assertIsInstance(request, transport.Request)

    def test_request_is_google_auth_httplib2_compatible(self):
        request = transport.create_request()
        self.assertIsInstance(request, google_auth_httplib2.Request)

    def test_request_call_returns_response_content(self):
        request = transport.Request(self.mock_http)
        response = request(self.test_uri)
        self.assertEqual(self.mock_response.status, response.status)
        self.assertEqual(self.mock_content, response.data)

    def test_request_call_forces_user_agent_no_provided_headers(self):
        request = transport.Request(self.mock_http)
        request(self.test_uri)
        headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', headers)
        self.assertIn(transport.GAM_USER_AGENT, headers['user-agent'])

    def test_request_call_forces_user_agent_no_agent_in_headers(self):
        request = transport.Request(self.mock_http)
        fake_request_headers = {
            'some-header-thats-not-a-user-agent': 'someData'
        }
        request(self.test_uri, headers=fake_request_headers)
        final_headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', final_headers)
        self.assertIn(transport.GAM_USER_AGENT, final_headers['user-agent'])
        self.assertIn('some-header-thats-not-a-user-agent', final_headers)
        self.assertEqual('someData',
                         final_headers['some-header-thats-not-a-user-agent'])

    def test_request_call_forces_user_agent_with_another_agent_in_headers(self):
        request = transport.Request(self.mock_http)
        headers_with_user_agent = {'user-agent': 'existing-user-agent'}
        request(self.test_uri, headers=headers_with_user_agent)
        final_headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', final_headers)
        self.assertIn('existing-user-agent', final_headers['user-agent'])
        self.assertIn(transport.GAM_USER_AGENT, final_headers['user-agent'])

    def test_request_call_same_user_agent_already_in_headers(self):
        request = transport.Request(self.mock_http)
        same_user_agent_header = {'user-agent': transport.GAM_USER_AGENT}
        request(self.test_uri, headers=same_user_agent_header)
        final_headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', final_headers)
        self.assertIn(transport.GAM_USER_AGENT, final_headers['user-agent'])
        # Make sure the header wasn't duplicated
        self.assertEqual(len(transport.GAM_USER_AGENT),
                         len(final_headers['user-agent']))
    def test_authorizedhttp_is_google_auth_httplib2_compatible(self):
        http = transport.AuthorizedHttp(self.mock_credentials)
        self.assertIsInstance(http, google_auth_httplib2.AuthorizedHttp)

    def test_authorizedhttp_request_returns_response_content(self):
        http = transport.AuthorizedHttp(self.mock_credentials,
                                        http=self.mock_http)
        response, content = http.request(self.test_uri)
        self.assertEqual(self.mock_response, response)
        self.assertEqual(self.mock_content, content)

    def test_authorizedhttp_request_forces_user_agent_no_provided_headers(self):
        authorized_http = transport.AuthorizedHttp(self.mock_credentials,
                                                   http=self.mock_http)
        authorized_http.request(self.test_uri)
        headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', headers)
        self.assertIn(transport.GAM_USER_AGENT, headers['user-agent'])

    def test_authorizedhttp_request_forces_user_agent_no_agent_in_headers(self):
        authorized_http = transport.AuthorizedHttp(self.mock_credentials,
                                                   http=self.mock_http)
        fake_request_headers = {
            'some-header-thats-not-a-user-agent': 'someData'
        }
        authorized_http.request(self.test_uri, headers=fake_request_headers)
        final_headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', final_headers)
        self.assertIn(transport.GAM_USER_AGENT, final_headers['user-agent'])
        self.assertIn('some-header-thats-not-a-user-agent', final_headers)
        self.assertEqual('someData',
                         final_headers['some-header-thats-not-a-user-agent'])

    def test_authorizedhttp_request_forces_user_agent_with_another_agent_in_headers(
            self):
        authorized_http = transport.AuthorizedHttp(self.mock_credentials,
                                                   http=self.mock_http)
        headers_with_user_agent = {'user-agent': 'existing-user-agent'}
        authorized_http.request(self.test_uri, headers=headers_with_user_agent)
        final_headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', final_headers)
        self.assertIn('existing-user-agent', final_headers['user-agent'])
        self.assertIn(transport.GAM_USER_AGENT, final_headers['user-agent'])

    def test_authorizedhttp_request_same_user_agent_already_in_headers(self):
        authorized_http = transport.AuthorizedHttp(self.mock_credentials,
                                                   http=self.mock_http)
        same_user_agent_header = {'user-agent': transport.GAM_USER_AGENT}
        authorized_http.request(self.test_uri, headers=same_user_agent_header)
        final_headers = self.mock_http.request.call_args[1]['headers']
        self.assertIn('user-agent', final_headers)
        self.assertIn(transport.GAM_USER_AGENT, final_headers['user-agent'])
        # Make sure the header wasn't duplicated
        self.assertEqual(len(transport.GAM_USER_AGENT),
                         len(final_headers['user-agent']))
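All of the tests above use the same `call_args` idiom: stub out an `httplib2.Http`-like object with `MagicMock`, run the code under test, then read back the keyword arguments of the recorded call. A minimal self-contained version of that idiom (no GAM imports; names illustrative):

```python
from unittest.mock import MagicMock

# Stand-in for httplib2.Http: request() returns a (response, content) pair.
mock_http = MagicMock()
mock_http.request.return_value = ('fake-response', b'fake-content')


def do_request(http, uri):
    # Code under test: adds a user-agent header and performs the request.
    return http.request(uri, headers={'user-agent': 'sketch-agent/1.0'})


do_request(mock_http, 'http://example.com')

# call_args[0] is the positional-args tuple, call_args[1] the kwargs dict
# of the most recent call recorded by the mock.
sent_headers = mock_http.request.call_args[1]['headers']
```

Inspecting `call_args` after the fact is what lets the tests assert on the headers that were *actually sent*, rather than on intermediate state.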


@@ -13,220 +13,264 @@ from gam import fileutils
from gam import transport
from gam.var import *


class _DeHTMLParser(HTMLParser):

    def __init__(self):
        HTMLParser.__init__(self)
        self.__text = []

    def handle_data(self, data):
        self.__text.append(data)

    def handle_charref(self, name):
        self.__text.append(
            chr(int(name[1:], 16)) if name.startswith('x') else chr(int(name)))

    def handle_entityref(self, name):
        cp = name2codepoint.get(name)
        if cp:
            self.__text.append(chr(cp))
        else:
            self.__text.append('&' + name)

    def handle_starttag(self, tag, attrs):
        if tag == 'p':
            self.__text.append('\n\n')
        elif tag == 'br':
            self.__text.append('\n')
        elif tag == 'a':
            for attr in attrs:
                if attr[0] == 'href':
                    self.__text.append(f'({attr[1]}) ')
                    break
        elif tag == 'div':
            if not attrs:
                self.__text.append('\n')
        elif tag in {'http:', 'https'}:
            self.__text.append(f' ({tag}//{attrs[0][0]}) ')

    def handle_startendtag(self, tag, attrs):
        if tag == 'br':
            self.__text.append('\n\n')

    def text(self):
        return re.sub(r'\n{2}\n+', '\n\n',
                      re.sub(r'\n +', '\n', ''.join(self.__text))).strip()


def dehtml(text):
    try:
        parser = _DeHTMLParser()
        parser.feed(str(text))
        parser.close()
        return parser.text()
    except:
        from traceback import print_exc
        print_exc(file=sys.stderr)
        return text
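The `dehtml()` helper above works by subclassing the stdlib `HTMLParser`, collecting text nodes, and mapping structural tags to newlines. A trimmed-down, self-contained sketch of the same approach (only the `<p>`/`<br>` handling is kept here):

```python
import re
from html.parser import HTMLParser


class MiniDeHTML(HTMLParser):
    """Tiny HTML-to-text converter in the style of _DeHTMLParser above."""

    def __init__(self):
        super().__init__()
        self._text = []

    def handle_data(self, data):
        self._text.append(data)

    def handle_starttag(self, tag, attrs):
        if tag == 'p':
            self._text.append('\n\n')
        elif tag == 'br':
            self._text.append('\n')

    def text(self):
        # Collapse runs of 3+ newlines to a single blank line, as the
        # original's final re.sub does.
        return re.sub(r'\n{2}\n+', '\n\n', ''.join(self._text)).strip()


def mini_dehtml(html):
    parser = MiniDeHTML()
    parser.feed(html)
    parser.close()
    return parser.text()
```

The event-driven parser never builds a DOM, which is why this approach stays cheap even for large HTML mail bodies.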
def indentMultiLineText(message, n=0):
    return message.replace('\n', '\n{0}'.format(' ' * n)).rstrip()


def flatten_json(structure, key='', path='', flattened=None, listLimit=None):
    if flattened is None:
        flattened = {}
    if not isinstance(structure, (dict, list)):
        flattened[((path + '.') if path else '') + key] = structure
    elif isinstance(structure, list):
        for i, item in enumerate(structure):
            if listLimit and (i >= listLimit):
                break
            flatten_json(item,
                         f'{i}',
                         '.'.join([item for item in [path, key] if item]),
                         flattened=flattened,
                         listLimit=listLimit)
    else:
        for new_key, value in list(structure.items()):
            if new_key in ['kind', 'etag', '@type']:
                continue
            if value == NEVER_TIME:
                value = 'Never'
            flatten_json(value,
                         new_key,
                         '.'.join([item for item in [path, key] if item]),
                         flattened=flattened,
                         listLimit=listLimit)
    return flattened
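`flatten_json()` is what turns the nested Vault API responses in the hunks earlier into flat dotted-key rows for CSV output. An illustrative, trimmed-down copy (the `NEVER_TIME` substitution and `listLimit` are dropped for brevity) applied to a made-up matter record:

```python
def flatten(structure, key='', path='', flattened=None):
    """Recursively flatten dicts/lists into dotted keys, as flatten_json does."""
    if flattened is None:
        flattened = {}
    if not isinstance(structure, (dict, list)):
        flattened[((path + '.') if path else '') + key] = structure
    elif isinstance(structure, list):
        for i, item in enumerate(structure):
            flatten(item, f'{i}', '.'.join(p for p in [path, key] if p),
                    flattened=flattened)
    else:
        for new_key, value in structure.items():
            if new_key in ['kind', 'etag', '@type']:
                continue  # Metadata keys are skipped, as in the original.
            flatten(value, new_key, '.'.join(p for p in [path, key] if p),
                    flattened=flattened)
    return flattened


# Hypothetical sample record, shaped like a Vault matter with one hold.
row = flatten({'matterId': 'm1', 'holds': [{'name': 'h1'}], 'kind': 'vault#x'})
```

List elements become numeric path segments (`holds.0.name`), which keeps every leaf addressable as a single CSV column title.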
def formatTimestampYMD(timestamp):
    return datetime.datetime.fromtimestamp(int(timestamp) /
                                           1000).strftime('%Y-%m-%d')


def formatTimestampYMDHMS(timestamp):
    return datetime.datetime.fromtimestamp(int(timestamp) /
                                           1000).strftime('%Y-%m-%d %H:%M:%S')


def formatTimestampYMDHMSF(timestamp):
    return str(datetime.datetime.fromtimestamp(int(timestamp) / 1000))


def formatFileSize(fileSize):
    if fileSize == 0:
        return '0kb'
    if fileSize < ONE_KILO_BYTES:
        return '1kb'
    if fileSize < ONE_MEGA_BYTES:
        return f'{fileSize // ONE_KILO_BYTES}kb'
    if fileSize < ONE_GIGA_BYTES:
        return f'{fileSize // ONE_MEGA_BYTES}mb'
    return f'{fileSize // ONE_GIGA_BYTES}gb'


def formatMilliSeconds(millis):
    seconds, millis = divmod(millis, 1000)
    minutes, seconds = divmod(seconds, 60)
    hours, minutes = divmod(minutes, 60)
    return f'{hours:02d}:{minutes:02d}:{seconds:02d}'
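`formatMilliSeconds()` above is a classic `divmod` cascade: each call peels off one unit and feeds the remainder to the next step. A self-contained copy to show it end to end:

```python
def format_millis(millis):
    """Render a millisecond count as HH:MM:SS via successive divmods."""
    seconds, millis = divmod(millis, 1000)   # 1000 ms per second
    minutes, seconds = divmod(seconds, 60)   # 60 s per minute
    hours, minutes = divmod(minutes, 60)     # 60 min per hour
    return f'{hours:02d}:{minutes:02d}:{seconds:02d}'
```

For example, 3,725,000 ms is 3725 s, i.e. 1 hour, 2 minutes, 5 seconds.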
def integerLimits(minVal, maxVal, item='integer'):
    if (minVal is not None) and (maxVal is not None):
        return f'{item} {minVal}<=x<={maxVal}'
    if minVal is not None:
        return f'{item} x>={minVal}'
    if maxVal is not None:
        return f'{item} x<={maxVal}'
    return f'{item} x'


def get_string(i, item, optional=False, minLen=1, maxLen=None):
    if i < len(sys.argv):
        argstr = sys.argv[i]
        if argstr:
            if (len(argstr) >= minLen) and ((maxLen is None) or
                                            (len(argstr) <= maxLen)):
                return argstr
            controlflow.system_error_exit(
                2,
                f'expected <{integerLimits(minLen, maxLen, "string length")} for {item}>'
            )
        if optional or (minLen == 0):
            return ''
        controlflow.system_error_exit(2, f'expected a Non-empty <{item}>')
    elif optional:
        return ''
    controlflow.system_error_exit(2, f'expected a <{item}>')
def get_delta(argstr, pattern):
    tg = pattern.match(argstr.lower())
    if tg is None:
        return None
    sign = tg.group(1)
    delta = int(tg.group(2))
    unit = tg.group(3)
    if unit == 'y':
        deltaTime = datetime.timedelta(days=delta * 365)
    elif unit == 'w':
        deltaTime = datetime.timedelta(weeks=delta)
    elif unit == 'd':
        deltaTime = datetime.timedelta(days=delta)
    elif unit == 'h':
        deltaTime = datetime.timedelta(hours=delta)
    elif unit == 'm':
        deltaTime = datetime.timedelta(minutes=delta)
    if sign == '-':
        return -deltaTime
    return deltaTime


def get_delta_date(argstr):
    deltaDate = get_delta(argstr, DELTA_DATE_PATTERN)
    if deltaDate is None:
        controlflow.system_error_exit(
            2, f'expected a <{DELTA_DATE_FORMAT_REQUIRED}>; got {argstr}')
    return deltaDate


def get_delta_time(argstr):
    deltaTime = get_delta(argstr, DELTA_TIME_PATTERN)
    if deltaTime is None:
        controlflow.system_error_exit(
            2, f'expected a <{DELTA_TIME_FORMAT_REQUIRED}>; got {argstr}')
    return deltaTime
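`get_delta()` above maps a sign/count/unit match to a `datetime.timedelta`. The actual `DELTA_DATE_PATTERN`/`DELTA_TIME_PATTERN` constants live in `gam.var` and are not shown in this diff, so this standalone sketch supplies an assumed-equivalent regex (sign, digits, unit letter) purely for illustration:

```python
import datetime
import re

# Assumed shape of the delta patterns: e.g. '-4h', '+2d', '-1w'.
DELTA_PATTERN = re.compile(r'^([+-])(\d+)([wdhm])$')

UNIT_TO_TIMEDELTA = {
    'w': lambda n: datetime.timedelta(weeks=n),
    'd': lambda n: datetime.timedelta(days=n),
    'h': lambda n: datetime.timedelta(hours=n),
    'm': lambda n: datetime.timedelta(minutes=n),
}


def get_delta(argstr, pattern=DELTA_PATTERN):
    """Return a signed timedelta for a delta string, or None if it doesn't match."""
    tg = pattern.match(argstr.lower())
    if tg is None:
        return None
    sign, count, unit = tg.group(1), int(tg.group(2)), tg.group(3)
    delta = UNIT_TO_TIMEDELTA[unit](count)
    return -delta if sign == '-' else delta
```

Returning `None` on a non-match is what lets the `get_delta_date`/`get_delta_time` wrappers decide how to report the error.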
def get_yyyymmdd(argstr, minLen=1, returnTimeStamp=False, returnDateTime=False):
    argstr = argstr.strip()
    if argstr:
        if argstr[0] in ['+', '-']:
            today = datetime.date.today()
            argstr = (datetime.datetime(today.year, today.month, today.day) +
                      get_delta_date(argstr)).strftime(YYYYMMDD_FORMAT)
        try:
            dateTime = datetime.datetime.strptime(argstr, YYYYMMDD_FORMAT)
            if returnTimeStamp:
                return time.mktime(dateTime.timetuple()) * 1000
            if returnDateTime:
                return dateTime
            return argstr
        except ValueError:
            controlflow.system_error_exit(
                2, f'expected a <{YYYYMMDD_FORMAT_REQUIRED}>; got {argstr}')
    elif minLen == 0:
        return ''
    controlflow.system_error_exit(2, f'expected a <{YYYYMMDD_FORMAT_REQUIRED}>')

def get_time_or_delta_from_now(time_string):
    """Get an ISO 8601 time or a positive/negative delta applied to now.

    Args:
        time_string (string): The time or delta (e.g. '2017-09-01T12:34:56Z' or '-4h')

    Returns:
        string: iso8601 formatted datetime in UTC.
    """
    time_string = time_string.strip().upper()
    if time_string:
        if time_string[0] not in ['+', '-']:
            return time_string
        return (datetime.datetime.utcnow() +
                get_delta_time(time_string)).isoformat() + 'Z'
    controlflow.system_error_exit(
        2, f'expected a <{YYYYMMDDTHHMMSS_FORMAT_REQUIRED}>')

def get_row_filter_date_or_delta_from_now(date_string):
    """Get an ISO 8601 date or a positive/negative delta applied to now.

    Args:
        date_string (string): The time or delta (e.g. '2017-09-01' or '-4y')

    Returns:
        string: iso8601 formatted datetime in UTC.
    """
    date_string = date_string.strip().upper()
    if date_string:
        if date_string[0] in ['+', '-']:
            deltaDate = get_delta(date_string, DELTA_DATE_PATTERN)
            if deltaDate is None:
                return (False, DELTA_DATE_FORMAT_REQUIRED)
            today = datetime.date.today()
            return (True,
                    (datetime.datetime(today.year, today.month, today.day) +
                     deltaDate).isoformat() + 'Z')
        try:
            deltaDate = dateutil.parser.parse(date_string, ignoretz=True)
            return (True,
                    datetime.datetime(deltaDate.year, deltaDate.month,
                                      deltaDate.day).isoformat() + 'Z')
        except ValueError:
            pass
    return (False, YYYYMMDD_FORMAT_REQUIRED)

def get_row_filter_time_or_delta_from_now(time_string):
    """Get an ISO 8601 time or a positive/negative delta applied to now.

    Args:
        time_string (string): The time or delta (e.g. '2017-09-01T12:34:56Z' or '-4h')

    Returns:
@@ -234,51 +278,57 @@ def get_row_filter_time_or_delta_from_now(time_string):

    Exits:
        2: Not a valid delta.
    """
    time_string = time_string.strip().upper()
    if time_string:
        if time_string[0] in ['+', '-']:
            deltaTime = get_delta(time_string, DELTA_TIME_PATTERN)
            if deltaTime is None:
                return (False, DELTA_TIME_FORMAT_REQUIRED)
            return (True,
                    (datetime.datetime.utcnow() + deltaTime).isoformat() + 'Z')
        try:
            deltaTime = dateutil.parser.parse(time_string, ignoretz=True)
            return (True, deltaTime.isoformat() + 'Z')
        except ValueError:
            pass
    return (False, YYYYMMDDTHHMMSS_FORMAT_REQUIRED)

def get_date_zero_time_or_full_time(time_string):
    time_string = time_string.strip()
    if time_string:
        if YYYYMMDD_PATTERN.match(time_string):
            return get_yyyymmdd(time_string) + 'T00:00:00.000Z'
        return get_time_or_delta_from_now(time_string)
    controlflow.system_error_exit(
        2, f'expected a <{YYYYMMDDTHHMMSS_FORMAT_REQUIRED}>')

def md5_matches_file(local_file, expected_md5, exitOnError):
    f = fileutils.open_file(local_file, 'rb')
    hash_md5 = md5()
    for chunk in iter(lambda: f.read(4096), b''):
        hash_md5.update(chunk)
    actual_hash = hash_md5.hexdigest()
    if exitOnError and actual_hash != expected_md5:
        controlflow.system_error_exit(
            6, f'actual hash was {actual_hash}. Exiting on corrupt file.')
    return actual_hash == expected_md5
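The `iter(lambda: f.read(4096), b'')` idiom above hashes a file in fixed-size chunks so memory stays flat regardless of file size. The same pattern works standalone with only the standard library (GAM's `fileutils` and `controlflow` wrappers are not assumed here):

```python
import hashlib
import io


def md5_of_stream(stream, chunk_size=4096):
    """Hash a binary stream in fixed-size chunks; avoids loading it whole."""
    hash_md5 = hashlib.md5()
    # iter() with a sentinel keeps calling stream.read(chunk_size)
    # until it returns the empty bytes sentinel b''.
    for chunk in iter(lambda: stream.read(chunk_size), b''):
        hash_md5.update(chunk)
    return hash_md5.hexdigest()
```

For example, `md5_of_stream(io.BytesIO(data))` matches `hashlib.md5(data).hexdigest()` for any byte string, including inputs larger than one chunk.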

URL_SHORTENER_ENDPOINT = 'https://gam-shortn.appspot.com/create'


def shorten_url(long_url, httpc=None):
    if not httpc:
        httpc = transport.create_http(timeout=10)
    headers = {'Content-Type': 'application/json', 'User-Agent': GAM_INFO}
    try:
        payload = json.dumps({'long_url': long_url})
        resp, content = httpc.request(URL_SHORTENER_ENDPOINT,
                                      'POST',
                                      payload,
                                      headers=headers)
    except:
        return long_url
    if resp.status != 200:

File diff suppressed because it is too large


@@ -9,7 +9,7 @@ b = sys.argv[2]
#result = version.parse(a) >= version.parse(b)
result = LooseVersion(a) >= LooseVersion(b)
if result:
    print('OK: %s is equal or newer than %s' % (a, b))
else:
    print('ERROR: %s is older than %s' % (a, b))
sys.exit(not result)
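`LooseVersion` compares dotted version strings component by component as numbers, so '4.52' ranks above '4.9' where a plain string comparison would not. Since `distutils` is deprecated, the numeric core of that comparison can be approximated with a tuple key; this is a simplified sketch that assumes purely numeric components, whereas the real `LooseVersion` also tolerates letters:

```python
def version_key(version):
    """Split '4.52.1' into (4, 52, 1) so comparison is numeric, not lexical.

    Simplification: raises ValueError on non-numeric components such as
    '1.0b2', which LooseVersion itself would accept.
    """
    return tuple(int(part) for part in version.split('.'))


def is_equal_or_newer(a, b):
    """Mirror the script's check: is version a equal to or newer than b?"""
    return version_key(a) >= version_key(b)
```

Tuples of different lengths compare element-wise then by length, so '5.0' < '5.0.1' falls out of the same key with no extra code.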


@@ -57,7 +57,7 @@ if [ $SSLRESULT -ne 0 ] || [[ "$SSLVER" != "OpenSSL $BUILD_OPENSSL_VERSION "* ]]
  rm -rf ssl
  mkdir python
  mkdir ssl

  # Compile latest OpenSSL
  wget --quiet https://www.openssl.org/source/openssl-$BUILD_OPENSSL_VERSION.tar.gz
  echo "Extracting OpenSSL..."
@@ -107,4 +107,4 @@ cd $whereibelong
$pip install --upgrade pip
$pip list --outdated --format=freeze | grep -v '^\-e' | cut -d = -f 1 | xargs -n1 $pip install -U
$pip install --upgrade -r src/requirements.txt
$pip install --upgrade https://github.com/pyinstaller/pyinstaller/archive/develop.tar.gz


@@ -9,4 +9,4 @@ cfg['refresh_token'] = os.getenv('refresh_%s' % jid)
gampath = os.getenv('gampath')
out_file = os.path.join(gampath, 'oauth2.txt')
with open(out_file, 'w') as f:
    json.dump(cfg, f)
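The snippet above rebuilds an `oauth2.txt` credentials file from environment variables. The same write-config-to-directory pattern can be sketched as a small helper; the file name `oauth2.txt` follows the original, while the function name and the use of a caller-supplied directory are illustrative choices:

```python
import json
import os
import tempfile


def write_oauth_config(cfg, directory):
    """Serialize the config dict as JSON to oauth2.txt in the given directory."""
    out_file = os.path.join(directory, 'oauth2.txt')
    with open(out_file, 'w') as f:
        json.dump(cfg, f)
    return out_file
```

Taking the directory as a parameter (rather than reading `gampath` from the environment, as the CI script does) makes the helper trivially testable against a temporary directory.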