5 Tips for Managing Amazon WorkSpaces at Scale

Scaling WorkSpaces is an opportunity to review UX, security, and access policies

Through Amazon WorkSpaces, companies can provide a secure cloud-based virtual desktop experience without the burden of managing a complex environment. It’s end-user friendly, offering the same feel as the Windows experience users are accustomed to on their local laptops or desktops. WorkSpaces not only provides easy access worldwide – it also offers state-of-the-art infrastructure security, something that’s difficult to achieve on local servers.

WorkSpaces’ secure desktop-as-a-service (DaaS) model enables fast provisioning, scalability, and autoscaling, but it’s not a set-it-and-forget-it proposition. As you add new users, you’ll need to manage the user experience, modify access and security controls, and handle other tasks. While WorkSpaces was designed to require as little human intervention as possible, there are things you need to know about managing it at scale. Here are five tips.

1. You’ll have to request an increase in service limits

When you created your Amazon Web Services (AWS) account, default quotas (also called limits) were set on the number of resources you can create. To raise these limits, you must contact AWS and request increases for:

  • The number of WorkSpaces per region: Since the default here is 1, you’ve likely already had to request an increase in the past for that region.
  • The number of elastic network interfaces (ENIs) per region: The default here is 5,000, and that may be enough. But if you want to scale WorkSpaces beyond that, you’ll have to contact AWS. Also keep in mind that the quota is for the entire region, not individual virtual private clouds (VPCs). Another consideration is that many different types of resources that live in your VPCs – EC2 instances, Lambda functions, and WorkSpaces – use ENIs. 
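
If you want to check your current quotas programmatically before opening a request, the Service Quotas API exposes them. Below is a minimal Boto3 sketch; it assumes the WorkSpaces quotas live under the workspaces service code and the per-region ENI quota under the vpc service code, and it only lists values rather than requesting increases:

    import boto3

    # Service Quotas is regional, so point the client at the region you're scaling in.
    quotas = boto3.client("service-quotas", region_name="us-east-1")

    def print_quotas(service_code):
        # List every quota for the given service and print its current value.
        paginator = quotas.get_paginator("list_service_quotas")
        for page in paginator.paginate(ServiceCode=service_code):
            for quota in page["Quotas"]:
                print(f'{service_code}: {quota["QuotaName"]} = {quota["Value"]}')

    print_quotas("workspaces")  # includes the WorkSpaces-per-Region quota
    print_quotas("vpc")         # the network interfaces (ENI) per-Region quota is listed here

Actual increases still go through AWS, either as a support case or through the request_service_quota_increase call in the same API.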

2. You will have to add customer-managed keys (CMKs)

AWS Key Management Service (AWS KMS) is integrated with WorkSpaces so you can encrypt storage volumes using these keys. A single CMK can encrypt no more than 500 WorkSpaces, so managing WorkSpaces at scale requires creating new KMS CMKs.

  • When creating your new CMKs, select the appropriate users and roles to administer the keys, and add the WorkSpaces management cross-account role as a user of the key. Look at your AWS Accounts Detail page on the dashboard to find the Amazon Resource Name (ARN) of the resource.
  • Add your newly created keys to the Included Keys list in the Volume Encryption Keys section of the appropriate package detail page to enable dashboard access.
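
If you prefer to script key creation, a minimal Boto3 sketch might look like the following. The description and alias are illustrative, and you would still set the key administrators and the WorkSpaces management cross-account role (the ARN from your AWS Accounts Detail page) in the key policy as described above:

    import boto3

    kms = boto3.client("kms", region_name="us-east-1")

    # Create an additional symmetric key for WorkSpaces volume encryption.
    key = kms.create_key(
        Description="WorkSpaces volume encryption key (batch 2)",
        KeyUsage="ENCRYPT_DECRYPT",
    )

    # A friendly alias makes the key easy to spot when you add it to the
    # Included Keys list on the package detail page (alias name is illustrative).
    kms.create_alias(
        AliasName="alias/workspaces-volumes-2",
        TargetKeyId=key["KeyMetadata"]["KeyId"],
    )

    print("Created key:", key["KeyMetadata"]["Arn"])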

3. Automate WorkSpaces provisioning

You can use AWS Lambda to automate WorkSpaces provisioning and de-provisioning based on directory group membership. Detailed instructions are available here, but below are the basic steps for building an automated solution that uses your existing directory group approval workflows to determine which WorkSpaces to provision.

The solution works by creating a Lambda function with VPC access, which lets it use the Python LDAP3 library to connect to your directory services. A simple LDAPS search then returns the list of members in a directory group.
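
Here is a simplified sketch of that lookup, assuming the service account credentials are stored in Secrets Manager; the LDAPS host, group name, search base, and secret name are placeholders:

    import json

    import boto3
    from ldap3 import ALL, SUBTREE, Connection, Server

    # Fetch the domain service account credentials (placeholder secret name and layout).
    secrets = boto3.client("secretsmanager")
    secret = json.loads(
        secrets.get_secret_value(SecretId="workspaces/ldap-service-account")["SecretString"]
    )

    # Connect to directory services over LDAPS (port 636) from inside the VPC.
    server = Server("ldaps.example.corp", port=636, use_ssl=True, get_info=ALL)
    conn = Connection(server, user=secret["username"], password=secret["password"], auto_bind=True)

    # A single LDAPS search returns the members of the WorkSpaces approval group.
    conn.search(
        search_base="OU=Groups,DC=example,DC=corp",
        search_filter="(cn=WorkSpaces-Users)",
        search_scope=SUBTREE,
        attributes=["member"],
    )
    members = conn.entries[0].member.values if conn.entries else []
    print(f"{len(members)} members in the directory group")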

Using the Python Boto3 library to access the WorkSpaces API, you can compare the directory group members to current WorkSpaces users. Then, you create WorkSpaces for group members who do not have one and terminate WorkSpaces for users who are no longer in the group.
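
Continuing the sketch, the comparison step might look like this; the directory ID, bundle ID, and the mapping of group members to usernames are assumptions, and the published solution adds error handling and username normalization:

    import boto3

    workspaces = boto3.client("workspaces")

    DIRECTORY_ID = "d-0123456789"  # placeholder directory ID
    BUNDLE_ID = "wsb-0123456789"   # placeholder bundle ID

    # Current WorkSpaces in the directory, keyed by username.
    existing = {}
    paginator = workspaces.get_paginator("describe_workspaces")
    for page in paginator.paginate(DirectoryId=DIRECTORY_ID):
        for ws in page["Workspaces"]:
            existing[ws["UserName"].lower()] = ws["WorkspaceId"]

    # group_members would come from the LDAPS search above, mapped to usernames.
    group_members = {"alice", "bob"}  # placeholder

    # Provision WorkSpaces for group members who don't have one yet.
    to_create = [
        {"DirectoryId": DIRECTORY_ID, "UserName": user, "BundleId": BUNDLE_ID}
        for user in group_members if user not in existing
    ]
    if to_create:
        workspaces.create_workspaces(Workspaces=to_create)

    # Terminate WorkSpaces for users who are no longer in the group.
    to_terminate = [
        {"WorkspaceId": ws_id}
        for user, ws_id in existing.items() if user not in group_members
    ]
    if to_terminate:
        workspaces.terminate_workspaces(TerminateWorkspaceRequests=to_terminate)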

Creating this automated provisioning requires the following steps:

  • Store the domain service account password in AWS Secrets Manager.
  • Create an IAM policy and role for the AWS Lambda function.
  • Create a security group that allows the Lambda function to connect to LDAPS.
  • Create a .zip file with the Lambda code and its dependencies.
  • Create the Lambda function.
  • Create a CloudWatch Events rule to run the Lambda function on a schedule.

The prerequisites include an AWS account, a domain services account, access to a workstation with Python and Pip installed, a VPC with access to both the internet and secure LDAP, and an existing WorkSpaces environment.
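
As an example of the final scheduling step, here is a minimal Boto3 sketch that wires a CloudWatch Events (now EventBridge) rule to the function; the rule name, function name, ARN, and schedule are illustrative:

    import boto3

    events = boto3.client("events")
    lambda_client = boto3.client("lambda")

    FUNCTION_ARN = "arn:aws:lambda:us-east-1:111122223333:function:workspaces-provisioner"  # placeholder

    # Run the provisioning function every hour.
    rule = events.put_rule(
        Name="workspaces-provisioning-schedule",
        ScheduleExpression="rate(1 hour)",
        State="ENABLED",
    )

    # Point the rule at the Lambda function...
    events.put_targets(
        Rule="workspaces-provisioning-schedule",
        Targets=[{"Id": "workspaces-provisioner", "Arn": FUNCTION_ARN}],
    )

    # ...and allow CloudWatch Events to invoke it.
    lambda_client.add_permission(
        FunctionName="workspaces-provisioner",
        StatementId="allow-events-invoke",
        Action="lambda:InvokeFunction",
        Principal="events.amazonaws.com",
        SourceArn=rule["RuleArn"],
    )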

4. Refine your approach to security

As you add new WorkSpaces at scale, you want to make sure your assets remain secure. When you set up your WorkSpaces environment, you implemented security groups, and you likely configured them so that external users have only HTTP and HTTPS access to specific internal websites from trusted IP addresses.

However, once you begin scaling up, consider implementing more granular access control for individual users. You can define an additional, more restrictive security group and attach it to an individual user’s WorkSpace.

This means you can use a single directory to handle many different users with different network security requirements while ensuring that third-party users only have access to authorized data and systems. In addition to security groups, you can use your preferred host-based firewall on a given WorkSpace to limit network access to resources within the VPC.
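
Because security groups attach to a WorkSpace through its underlying elastic network interface, one way to script the per-user attachment is to look up that ENI by the WorkSpace’s private IP address and add the restrictive group to it. The WorkSpace ID and security group ID below are placeholders:

    import boto3

    workspaces = boto3.client("workspaces")
    ec2 = boto3.client("ec2")

    WORKSPACE_ID = "ws-0123456789"       # placeholder WorkSpace ID
    RESTRICTIVE_SG_ID = "sg-0123456789"  # placeholder restrictive security group

    # Find the WorkSpace's private IP address...
    ws = workspaces.describe_workspaces(WorkspaceIds=[WORKSPACE_ID])["Workspaces"][0]

    # ...locate its elastic network interface in the VPC...
    eni = ec2.describe_network_interfaces(
        Filters=[{"Name": "private-ip-address", "Values": [ws["IpAddress"]]}]
    )["NetworkInterfaces"][0]

    # ...and replace its security groups with the existing set plus the restrictive one.
    current_groups = [g["GroupId"] for g in eni["Groups"]]
    ec2.modify_network_interface_attribute(
        NetworkInterfaceId=eni["NetworkInterfaceId"],
        Groups=sorted(set(current_groups + [RESTRICTIVE_SG_ID])),
    )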

5. Consider the user experience

Scaling up is a great time to review the user experience. Administrators control what users can personalize in their WorkSpace. The default setting allows users to personalize their WorkSpaces with their favorite settings for wallpaper, icons, shortcuts, and other items that have no impact on the overall environment.

Administrators can also lock down a WorkSpace by using Group Policy for Windows to restrict a user’s ability to personalize it. It’s important to remember that Group Policy settings affect the user experience:

  • If you disable removable storage, you cause a login failure: users are logged in to a temporary user profile with no access to drive D.
  • If you remove any users from the Remote Desktop Users local group via the Group Policy setting, those users cannot authenticate through the WorkSpaces client applications.
  • Removing the built-in Users group from the Allow log on locally security policy means your PCoIP WorkSpaces users won’t be able to connect to their WorkSpaces through the WorkSpaces client applications. Also, your PCoIP WorkSpaces won’t receive updates to the PCoIP agent software. PCoIP agent updates might contain security and other fixes, or they might enable new features for your WorkSpaces.
  • You can use Group Policy settings to restrict drive access, but if you restrict access to drive C or drive D, your users won’t be able to access their WorkSpaces.

Other Group Policy settings can affect audio-in access. Using the wrong power settings can cause WorkSpaces to sleep when left idle or force users to be logged off when they disconnect from a session, closing all of their applications. A negative user experience can affect productivity and morale, so before you scale up, review your Group Policy settings.

Amazon WorkSpaces offers ease of use, reliability, easy deployment and management, and support for remote work. It also offers all the tools you need to manage your WorkSpaces at scale.

Scaling up? CloudHesive can help

CloudHesive will help you get started, and we’ll help you scale. Together, we’ll develop an Amazon WorkSpaces solution that meets your needs, from desktop images to WorkSpaces bundles. We perform a detailed assessment of your current environments, develop a comprehensive project and implementation plan, and architect the next generation of your WorkSpaces and AppStream environments.

We’ve helped more than 100 companies reduce their operating costs and increase productivity with our focus on security, reliability, availability, and scalability. With over 30 years of experience, we leverage cloud-based technology to its full potential. Contact the CloudHesive team today.
