Public Cloud Account Lifecycle Management: Automating Account Creation and Deletion
By Martin Yankov
Cloud technologies have become a ubiquitous part of organizations’ IT strategies. Faster performance and enhanced scalability options, both horizontal and vertical, enable companies to better reach their customers and deliver value in shorter cycles. With rapidly changing business requirements, clouds give organizations more flexibility and save substantial effort and money. Service providers such as VMware can therefore respond quickly to changing needs, innovate faster, and iterate more rapidly, leading to higher efficiency and increased adoption of cloud services.
Popular public cloud providers have different product catalogs, but the service offerings and their usage are typically similar. This makes it easier for IT to adopt and manage these services; however, processes and programs need to be in place to scale accounts up or down (adding new accounts and deleting ones no longer needed) while providing secure access, optimizing costs, preventing inappropriate use of resources, and maintaining an inventory. This is where account lifecycle management comes in. Here, we’ll provide code samples that work with the top public clouds; you can use these to automate account creation and deletion.
To achieve account management at scale at VMware, we addressed the topic right at the beginning of each public cloud entity’s lifecycle: its creation. Whether they’re called accounts, projects, or subscriptions, these logical containers are similar across the clouds we’ll look at and act as the gateway to all of the other service offerings. We present how we made the process of creating and deleting these containers work across multiple cloud providers.
This blog post details the similarities and differences between the popular cloud vendors and shows how we did account lifecycle management at VMware for different clouds using code samples.
Overview of Cloud Vendors
Amazon Web Services
Amazon Web Services (AWS) is one of the most popular on-demand cloud computing platforms, used by individuals, companies, and governments alike. Provisioning in AWS is built around Identity and Access Management (IAM), which lets you create and manage identities and grant them access to specific cloud services, including billing. You can manage account provisioning on a standalone basis or by using AWS Organizations; the former is recommended for smaller organizations and the latter for larger ones.
AWS Organizations lets you quickly scale environments with scripts to create new AWS accounts on-demand; you can also easily manage account permissions. AWS Organizations also supports the creation of parent-child relationships using Organizational Units, which you can use to separate accounts into different groups, such as teams, business units, or services, while retaining management and overview in one place—at the uppermost account level.
AWS Organizations enables the logical separation of billing and management of its organization members (separate AWS accounts). AWS Organizations also allows the creation of new member AWS accounts through a dedicated API.
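As a quick illustration, Organizational Units can also be created programmatically. The minimal sketch below uses boto3 against the AWS Organizations API; the OU name is an arbitrary example value, and it assumes the caller has Organizations permissions in the management account.
import boto3

# Minimal sketch: create an Organizational Unit under the organization root.
client = boto3.client('organizations')

root_id = client.list_roots()['Roots'][0]['Id']
ou = client.create_organizational_unit(ParentId=root_id, Name='business-unit-a')  # example OU name
print(ou['OrganizationalUnit']['Id'])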
Microsoft Azure
Azure is Microsoft’s cloud computing service for application management via Microsoft-managed data centers. You can provision Azure resources in logical containers called Azure subscriptions. Each subscription contains details for the hosted cloud resources, enabling a logical separation for billing or access management purposes. To create subscriptions, Azure requires using a Microsoft account that, on its own, acts as a user identifier for login and role assignments.
Aside from subscriptions, Azure has the notion of Azure directories, which are also required for users to provision specific workloads. Azure directories integrate with existing enterprise Active Directory deployments to homogenize the user pool. Each Azure directory (also referred to as a tenant) contains a Tenant Root Management Group, and every subscription is provisioned under this management group by default. Much like AWS Organizational Units, the root group can be split further into child management groups to facilitate better logical separation.
Azure offers Azure Enterprise Agreements for larger enterprises to manage the billing of linked subscriptions. Enterprise Agreements decouple access management from billing and, similarly to AWS, can enable Azure users to create subscriptions within the enterprise agreement and ensure billing is ready from the get-go.
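The subscription-creation sample later in this post needs a billing account ID and an enrollment account ID. As a rough sketch, enrollment accounts can be discovered through the Azure Billing REST API; the function name is ours, the snippet assumes a service principal token with billing read permissions, and the preview API version shown is an assumption, so treat it as illustrative rather than definitive.
import requests

def list_enrollment_accounts(access_token: str) -> list:
    # Illustrative only: enumerate the enrollment accounts visible to the caller.
    response = requests.get(
        "https://management.azure.com/providers/Microsoft.Billing/enrollmentAccounts?api-version=2018-03-01-preview",
        headers={"Authorization": f"Bearer {access_token}"},
    )
    response.raise_for_status()
    return response.json().get("value", [])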
Google Cloud Platform
Google Cloud Platform (GCP) is a suite of cloud computing services running on the same infrastructure that Google uses internally. GCP has the notion of projects to allow the use of the offered services.
“A Google Cloud Platform (GCP) project is a set of configuration settings that define how your app interacts with Google services and what resources it uses” (cited from https://developers.google.com/workspace/marketplace/create-gcp-project).
Like Azure, GCP requires a valid Google account to set up access to projects using role assignments. Google accounts can be standalone, managed through Google Workspace, or federated with an existing directory such as Active Directory.
Like Azure and AWS, Google has also introduced resource grouping for larger enterprises, called GCP Organizations. Organizations allow managing default settings for projects at a higher level, as well as logically grouping projects into folders and subfolders, which allows for better permission management.
A Cloud Billing Account is required to define who ultimately pays for a set of Google Cloud resources in the previously mentioned projects. Both standalone and organization projects must be linked to a billing account so that resource usage can be allocated and charged appropriately.
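As a minimal sketch of that linking step, the Cloud Billing API can attach a project to a billing account. The service-account key path, project ID, and billing account name below are placeholders, and the snippet assumes the caller has billing administrator permissions.
import googleapiclient.discovery
from google.oauth2 import service_account

# Placeholder credentials file; the scope grants access to the Cloud Billing API
credentials = service_account.Credentials.from_service_account_file(
    '/location/of/service-user-credentials-file',
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
billing = googleapiclient.discovery.build('cloudbilling', 'v1', credentials=credentials, cache_discovery=False)

# Attach the project to the billing account so resource usage can be charged
billing.projects().updateBillingInfo(
    name='projects/production-service',
    body={'billingAccountName': 'billingAccounts/012345-6789AB-CDEF01'},
).execute()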
Access Control
There are different identity management techniques; in this section, we look at the two most common strategies: role-based access control (RBAC) and attribute-based access control (ABAC). The main difference is how each method grants access to a specific resource. RBAC grants access through roles and role attachments, while ABAC determines access by evaluating a set of rules and policies against specific attributes, such as environment, system, object, or user information. RBAC relies on pre-defined roles and paints access control in broad strokes; ABAC is more dynamic and offers finer granularity. AWS follows the ABAC model, where access is determined by evaluating policies against attributes, one of which can be the identity of a user with a dedicated username and password. GCP and Azure leverage the RBAC model, where a specific role, either custom or built-in, must be attached to a principal.
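To make the contrast concrete, here is a rough sketch of what each model looks like in policy form. Both snippets use made-up names and IDs: the first is an AWS-style identity policy whose tag-based condition is an ABAC pattern, and the second is a GCP-style IAM binding that attaches a built-in role to a principal, an RBAC pattern.
# ABAC (AWS-style): access is granted when the principal's "team" tag
# matches the resource's "team" tag; no per-team role is needed.
abac_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "ec2:StartInstances",
        "Resource": "*",
        "Condition": {
            "StringEquals": {"aws:ResourceTag/team": "${aws:PrincipalTag/team}"}
        },
    }],
}

# RBAC (GCP-style): a pre-defined role is bound directly to a principal.
rbac_binding = {
    "role": "roles/compute.instanceAdmin.v1",
    "members": ["user:alice@example.com"],
}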
Provisioning
These providers offer multiple APIs and software development kits (SDKs) in different programming languages to fit developers’ varying environments. In this article, our focus is the creation of many accounts with a pre-established billing construct.
AWS
For enterprise users, Amazon offers the ability to provision AWS accounts through the AWS Organizations service, which automatically sets up billing at the uppermost level, the management account. Here, we’re looking at the creation of a new account; the new account’s status needs to change to Active before we can operate on it further. The two input parameters we need to provide for account creation are
- Root email address (required) – As discussed above, AWS requires a root email address that can be used to access the account. It is best practice for larger enterprises to avoid using real personal mailboxes, as a root user email can be used for only one account.
- Account name (required) – The friendly name of the new account.
Code sample in Python for creating an AWS account:
import time

import boto3


def request_new_account(email: str, account_name: str, client) -> dict:
    try:
        response = client.create_account(Email=email, AccountName=account_name)
        return response
    except Exception as e:
        raise e


def get_account_state(request_id: str, client) -> str:
    try:
        response = client.describe_create_account_status(CreateAccountRequestId=request_id)
        return response['CreateAccountStatus']['State']
    except Exception as e:
        raise e


client = boto3.client('organizations')
request = request_new_account(
    email='awsresourcerootemail-123312@example.com',
    account_name='Production Environment',
    client=client,
)
status = request['CreateAccountStatus']['State']
# Poll until the creation request leaves the IN_PROGRESS state
while status == 'IN_PROGRESS':
    time.sleep(5)
    status = get_account_state(request_id=request['CreateAccountStatus']['Id'], client=client)
After successfully creating the account, you can access it through the provided root user email and the password.
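For programmatic access, member accounts created through AWS Organizations also get a default cross-account role (OrganizationAccountAccessRole, unless a different role name was passed to create_account). A minimal sketch of assuming that role from the management account, using a placeholder account ID:
import boto3

# Placeholder ID of the newly created member account
new_account_id = '123456789012'

sts = boto3.client('sts')
assumed = sts.assume_role(
    RoleArn=f'arn:aws:iam::{new_account_id}:role/OrganizationAccountAccessRole',
    RoleSessionName='account-bootstrap',
)

# Client that operates inside the new account using the temporary credentials
member_iam = boto3.client(
    'iam',
    aws_access_key_id=assumed['Credentials']['AccessKeyId'],
    aws_secret_access_key=assumed['Credentials']['SecretAccessKey'],
    aws_session_token=assumed['Credentials']['SessionToken'],
)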
Azure
Azure’s enterprise feature is called the Enterprise Agreement. Like AWS, it allows us to create subscriptions within the same billing scope and manage billing at the topmost level. Unlike AWS, Azure does not require specifying a dedicated email address, and subscriptions don’t need to be assigned to anyone upon creation; the account creator is initially the owner of each newly created subscription. Some input parameters that can be provided are
- Alias – Unique identifier for the subscription. This parameter is not going to be visible later.
- Billing Account Id – The account owner’s ID that will create the subscription.
- Enrollment Account Id – The id of the enterprise enrollment where the subscription will be created.
- Display Name – The name of the subscription to display in Azure Portal.
- Workload – The environment to use for the subscription. Azure offers Dev/Test and Production subscriptions depending on the needs.
Code sample in Python for creating an Azure subscription:
import json
import random
import string
import time

import requests


def get_access_token(token_url: str, client_id: str, client_secret: str, resource: str):
    try:
        get_token_data = {
            "grant_type": "client_credentials",
            "client_id": client_id,
            "resource": resource,
            "client_secret": client_secret,
        }
        token_response = requests.post(token_url, data=get_token_data)
        token_response_json = json.loads(token_response.text)
        access_token = token_response_json["access_token"]
        return access_token
    except Exception as e:
        raise e


def request_new_subscription(
    display_name: str,
    alias: str,
    environment: str,
    billing_account_id: str,
    enrollment_account_id: str,
    access_token: str
):
    try:
        data = {
            "properties": {
                "billingScope": f"/providers/Microsoft.Billing/BillingAccounts/{billing_account_id}/enrollmentAccounts/{enrollment_account_id}",
                "DisplayName": display_name,
                "Workload": environment,
            }
        }
        response = requests.put(
            f"https://management.azure.com/providers/Microsoft.Subscription/aliases/{alias}?api-version=2020-09-01",
            data=json.dumps(data),
            headers={
                "Authorization": f"Bearer {access_token}",
                "Content-type": "application/json"
            },
        )
        return response.json()
    except Exception as e:
        raise e


def check_subscription_status(alias: str, access_token: str) -> str:
    try:
        response = requests.get(
            f"https://management.azure.com/providers/Microsoft.Subscription/aliases/{alias}?api-version=2020-09-01",
            headers={
                "Authorization": f"Bearer {access_token}",
                "Content-type": "application/json"
            },
        )
        return response.json()['properties']['provisioningState']
    except Exception as e:
        raise e


display_name = 'Production Service'
# Build a unique alias from the display name plus a random suffix
alias = "-".join(display_name.lower().split()) + "".join(random.choice(string.ascii_lowercase) for i in range(5))
access_token = get_access_token(
    token_url="https://login.microsoftonline.com/12312312312/oauth2/token",
    client_id='spn-123131-random-client-id',
    client_secret='spn-123131-random-client-top-secret-value',
    resource="https://management.azure.com",
)
request = request_new_subscription(
    display_name=display_name,
    alias=alias,
    environment='Production',
    billing_account_id='1234567890',
    enrollment_account_id='1234567',
    access_token=access_token
)
status = request['properties']['provisioningState']
# Poll the alias until the subscription's provisioning state reaches Succeeded
while status != 'Succeeded':
    time.sleep(5)
    status = check_subscription_status(
        alias=alias,
        access_token=access_token
    )
After the subscription’s provisioning state moves to Succeeded, it’s ready for use.
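Since the creating principal initially owns the subscription, granting access to the actual requesters is a common follow-up step. Below is a hedged sketch using the Azure role assignments REST API; the function name is ours, and the subscription ID, role definition ID, and principal object ID are all placeholders.
import requests
import uuid

def assign_role(subscription_id: str, principal_object_id: str, role_definition_id: str, access_token: str):
    # Illustrative only: create a role assignment at subscription scope.
    scope = f"/subscriptions/{subscription_id}"
    assignment_name = str(uuid.uuid4())  # role assignment names are GUIDs
    response = requests.put(
        f"https://management.azure.com{scope}/providers/Microsoft.Authorization/roleAssignments/{assignment_name}?api-version=2022-04-01",
        json={
            "properties": {
                "roleDefinitionId": f"{scope}/providers/Microsoft.Authorization/roleDefinitions/{role_definition_id}",
                "principalId": principal_object_id,
            }
        },
        headers={"Authorization": f"Bearer {access_token}"},
    )
    return response.json()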
GCP
Project creation in GCP works similarly to Azure and AWS: we initiate the creation and then wait for the project to become active. Since we are using GCP Organizations, we need to specify a parent folder where the project will be created; a folder is simply a sub-container of an organization. The parameters we need to consider for GCP project creation are
- Project Name – A friendly name for the project.
- Project Id – For simplicity, we’ll just use the project name to build the project ID.
- Parent Folder – The parent folder ID in the GCP organization.
We’ll leverage service user credentials with the necessary permission to create projects in the organization.
Code sample in Python for creating projects:
import time

import googleapiclient.discovery
from google.oauth2 import service_account


def create_new_project(name: str, project_id: str, parent_folder: str, client):
    try:
        project_body = {
            "name": name,
            "projectId": project_id,
            "parent": {"type": "folder", "id": parent_folder},
        }
        response = client.projects().create(body=project_body).execute()
        return response
    except Exception as e:
        raise e


def check_project_status(project_id: str, client):
    try:
        response = client.projects().get(projectId=project_id).execute()
        return response
    except Exception as e:
        raise e


credentials = service_account.Credentials.from_service_account_file(
    '/location/of/service-user-credentials-file',
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
client = googleapiclient.discovery.build(
    'cloudresourcemanager',
    'v1',
    credentials=credentials,
    cache_discovery=False,
)
create_new_project(
    name="Production service",
    project_id='production-service',
    parent_folder="123123123123",
    client=client
)
# Poll until the project's lifecycle state becomes ACTIVE
status = None
while status != 'ACTIVE':
    time.sleep(5)
    status = check_project_status(project_id='production-service', client=client).get('lifecycleState')
Once the GCP project is created and in the ACTIVE state, it is available for use within the organization.
In summary, the process for public cloud provisioning is fairly consistent across AWS, Azure, and GCP. This allows us to easily provision new resources at scale in a homogeneous environment and keep things standard across clouds, which makes managing these resources much easier later. We can configure one entry point for requests for all three clouds (see the sketch below), which also lets us offer provisioning in a self-service manner. We’ll talk more about this later in the post.
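As a rough illustration of that single entry point, the sketch below routes a generic provisioning request to the per-cloud creation functions from the earlier samples (request_new_account, request_new_subscription, create_new_project). The request shape is an assumption, and the clients and token (aws_client, azure_access_token, gcp_client) are assumed to be initialized as in those samples.
def provision(request: dict) -> dict:
    # Example request: {"cloud": "aws", "name": "Production Environment", ...}
    cloud = request["cloud"].lower()
    if cloud == "aws":
        # AWS Organizations sample from the provisioning section
        return request_new_account(
            email=request["root_email"],
            account_name=request["name"],
            client=aws_client,
        )
    if cloud == "azure":
        # Azure subscription alias sample
        return request_new_subscription(
            display_name=request["name"],
            alias=request["alias"],
            environment=request.get("workload", "Production"),
            billing_account_id=request["billing_account_id"],
            enrollment_account_id=request["enrollment_account_id"],
            access_token=azure_access_token,
        )
    if cloud == "gcp":
        # GCP Resource Manager sample
        return create_new_project(
            name=request["name"],
            project_id=request["project_id"],
            parent_folder=request["parent_folder"],
            client=gcp_client,
        )
    raise ValueError(f"Unsupported cloud: {cloud}")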
Deletion
Once accounts, projects, or subscriptions have reached their end of life and are no longer needed, it is best practice to decommission them to reduce both the attack surface and the costs incurred. Luckily, all three providers also let us perform this operation programmatically. One important thing to note is that all running resources should be cleaned up before the account is closed.
AWS
Like account creation, account closure can also be performed through AWS Organizations. All we need is the account ID to initiate the closure request. After the account is closed, it can still be recovered for a period of time in case it was closed by accident or simply needs to be reactivated.
Code sample in Python for closing an account:
import time

import boto3


def get_account_status(account_id: str, client) -> str:
    try:
        response = client.describe_account(AccountId=account_id)
        return response['Account']['Status']
    except Exception as e:
        raise e


client = boto3.client('organizations')
client.close_account(AccountId='123456789012')

# Poll until the account shows as SUSPENDED (the closed state)
status = None
while status != 'SUSPENDED':
    time.sleep(5)
    status = get_account_status(account_id='123456789012', client=client)
Important: When an account is closed, the root user email address cannot be reused. This brings us back to provisioning; it is another reason to generate unique, dedicated root email addresses rather than reusing real mailboxes, as in the sketch below.
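A minimal sketch of one way to generate such unique root addresses, assuming a mail domain that supports plus-addressing; the domain, prefix, and function name are placeholders of our own.
import uuid

def generate_root_email(account_name: str, domain: str = "example.com") -> str:
    # Unique, non-personal root address per account; a shared mailbox
    # with plus-addressing receives all of them.
    suffix = uuid.uuid4().hex[:8]
    slug = "-".join(account_name.lower().split())
    return f"aws-root+{slug}-{suffix}@{domain}"

print(generate_root_email("Production Environment"))
# e.g. aws-root+production-environment-1a2b3c4d@example.com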
Azure
Canceling an Azure subscription requires only its subscription ID and goes through the Microsoft.Subscription cancel API. Code sample in Python for canceling a subscription:
import json
import time

import requests


def get_access_token(token_url: str, client_id: str, client_secret: str, resource: str):
    try:
        get_token_data = {
            "grant_type": "client_credentials",
            "client_id": client_id,
            "resource": resource,
            "client_secret": client_secret,
        }
        token_response = requests.post(token_url, data=get_token_data)
        token_response_json = json.loads(token_response.text)
        access_token = token_response_json["access_token"]
        return access_token
    except Exception as e:
        raise e


def cancel_subscription(subscription_id: str, access_token: str):
    try:
        response = requests.post(
            f"https://management.azure.com/subscriptions/{subscription_id}/providers/Microsoft.Subscription/cancel?api-version=2021-10-01",
            headers={
                "Authorization": f"Bearer {access_token}",
                "Content-type": "application/json"
            }
        )
        return response.json()
    except Exception as e:
        print(f"There was an error canceling subscription {subscription_id}")
        print(e)
        raise e


def get_subscription_state(subscription_id: str, access_token: str) -> str:
    # After cancellation, the subscription's state eventually moves to Disabled
    try:
        response = requests.get(
            f"https://management.azure.com/subscriptions/{subscription_id}?api-version=2020-01-01",
            headers={"Authorization": f"Bearer {access_token}"},
        )
        return response.json()['state']
    except Exception as e:
        raise e


access_token = get_access_token(
    token_url="https://login.microsoftonline.com/12312312312/oauth2/token",
    client_id='spn-123131-random-client-id',
    client_secret='spn-123131-random-client-top-secret-value',
    resource="https://management.azure.com",
)
request = cancel_subscription(
    subscription_id='subscription-id-21312',
    access_token=access_token,
)
# Poll until the subscription shows as Disabled
status = None
while status != 'Disabled':
    time.sleep(5)
    status = get_subscription_state(
        subscription_id='subscription-id-21312',
        access_token=access_token,
    )
GCP
Shutting down a GCP project likewise requires only its project ID. Code sample in Python to shut down a GCP project:
import googleapiclient.discovery
from google.oauth2 import service_account


def delete_project(project_id: str, client):
    try:
        # Resource Manager v3 addresses projects by their full resource name
        response = client.projects().delete(name=f"projects/{project_id}").execute()
        return response
    except Exception as e:
        raise e


def check_project_status(project_id: str, client):
    try:
        response = client.projects().get(name=f"projects/{project_id}").execute()
        return response
    except Exception as e:
        raise e


credentials = service_account.Credentials.from_service_account_file(
    '/location/of/service-user-credentials-file',
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
client = googleapiclient.discovery.build(
    'cloudresourcemanager',
    'v3',
    credentials=credentials,
    cache_discovery=False,
)
delete_project(project_id='testproject-352012', client=client)
Putting it all together
This blog covers account lifecycle management for AWS, Azure, and GCP. Automating the account lifecycle makes the process far more efficient when dealing with thousands of accounts. For compliance purposes, it is critical to have an audit trail of account creation and deletion. These requests might also need manager approval or financial review, as there are cost implications. At VMware, we have largely automated these flows using Jira Service Desk (JSD) and our own distributed workflow engine built on functions running on a serverless platform. Automating account creation and deletion significantly improves turnaround time and simplifies account lifecycle management.