October 12, 2020

How to Replace the Self-signed Certificate for Nutanix Prism Element and Prism Central

Purpose:

Demonstrate how to replace the self-signed certificate on Nutanix Prism Element and Prism Central.

Introduction:

There are many blogs out there about how to replace the self-signed certificate in Nutanix Prism Element and Prism Central with a domain-signed certificate. A lot of them reference creating the Certificate Signing Request (CSR) on the command line with OpenSSL on a Linux or Windows machine. There is an alternative: the CSR can very easily be created using the Microsoft certificate snap-in, with OpenSSL then only needed to convert the resulting certificate into a format that Prism Element and Prism Central will accept. This is useful because a lot of workloads (Citrix, VMware, etc.) are being migrated to Nutanix for the hyper-converged benefits, and replacing the certificate eliminates the browser warning and improves the security posture of the environment.

Configuration Steps:

Launch the Microsoft Certificate Snap-in for the Local Computer.

In this case I am going to create a custom request because I want to be able to define the Subject Alternative Names and use the same certificate for Prism Central and Prism Element. At a minimum there needs to be at least one Subject Alternative Name matching the Common Name, or popular browsers such as Mozilla Firefox, Google Chrome and Chromium Edge will produce a certificate warning stating that the Subject Alternative Name is missing.

Click the Personal node -> Right Click in the Certificate Snap-in -> All Tasks -> Advanced Operations -> Create Custom Request.


Click Next

Since we are creating a custom CSR and do not want to be dependent on an Active Directory Enrollment Policy, select Proceed without an Enrollment Policy

On the Template dropdown, select (No Template) Legacy Key -> Click Next

Expand Details -> Click Properties

For the Friendly Name, fill in the friendly name of the certificate. In my case prism.domain.lab.

Fill in all of the certificate details such as the Common Name, Organization, Organizational Unit, Locality and State under the Subject name section. Under the Alternative Name section, fill in all of the Alternative Names; in my case prism.domain.lab, prism, prismcentral.domain.lab and prismcentral. This allows both short names and fully qualified domain names to be used without producing certificate warnings.

Under the Private Key tab, make sure that the Key Size is 2048 bit (always use this) and that Mark private key exportable is checked; otherwise, after the certificate has been signed it cannot be exported with its private key. -> Click Apply

Save the CSR to a location for easy access
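
Optionally, before submitting it, you can sanity-check the CSR with OpenSSL to confirm the Subject and the Subject Alternative Names made it in. This is just a quick check; the filename below is an example of whatever you saved the request as:

# openssl req -in prism.domain.lab.req -noout -text

The Alternative Names should appear under the Requested Extensions section of the output.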

Now head over to your Domain CA Web Enrollment portal, typically from a browser go to: https://domaincafqdn/certsrv. Click Request a certificate.
Click advanced certificate request

Click Submit a certificate request by using a base-64 encoded CMC or PKCS #10 file


Open the CSR generated earlier and copy and paste its contents into the Base-64-encoded certificate request (PKCS #10 or PKCS #7) field. For the Certificate Template, select the appropriate Web Server template defined by your Domain CA administrator. In my case Web Server 2048Bit SHA256.

Once the CSR has been submitted and the request has been issued by the domain CA, save the resulting certificate file in a place for easy access. In my case the file is named prism.domain.lab.answer.cer
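
To confirm the issued certificate looks right before importing it, you can inspect it with OpenSSL. This assumes the Base-64/PEM encoded download; add -inform DER if you saved the DER-encoded version:

# openssl x509 -in prism.domain.lab.answer.cer -noout -subject -issuer -dates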

Now, back on the machine where the CSR was generated, in the Microsoft Certificate Snap-in: Right Click -> All Tasks -> Import

Click Next


Click Browse


Browse to the previous location where the answer file was saved. -> Click Open


Click Next


Make sure the Personal Store is selected -> Click Next


Confirm settings -> Click Finish

Now the certificate needs to be exported as a PFX file, which contains the private key. When exporting from Windows, the private key is encrypted with a password. This password will need to be retained in order to perform the next steps in OpenSSL, which will extract the certificate pieces and remove the password from the private key.

We can extract the private key from a PFX to a PEM file with this command:
# openssl pkcs12 -in filename.pfx -nocerts -out key.pem

Exporting the certificate only:
# openssl pkcs12 -in filename.pfx -clcerts -nokeys -out cert.pem

Removing the password from the extracted private key:
# openssl rsa -in key.pem -out server.key
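
Before moving on, it is worth double-checking that the extracted certificate and private key actually belong together. A quick way to do this (using the example filenames from the commands above) is to compare the modulus of each; the two hashes should be identical:

# openssl x509 -noout -modulus -in cert.pem | openssl md5
# openssl rsa -noout -modulus -in server.key | openssl md5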


Open the .key file and remove the Bag Attributes and issuer information, or Prism Element and Prism Central will not be able to use it. In addition, the intermediate and root certificates for the Domain CA need to be available; if there are both an intermediate and a root certificate, they should be copied into a single file simply saved with a .cer extension.
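
Building that combined chain file is just a concatenation of the Base-64/PEM encoded CA certificates. As a sketch, assuming the intermediate and root have been exported as intermediate.cer and root.cer (example names) and using the chain filename referenced later in this post:

# cat intermediate.cer root.cer > domain1.ca.cer

You can optionally confirm that the chain validates the public certificate with:

# openssl verify -CAfile domain1.ca.cer cert.pem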


Log in to either Prism Element or Prism Central


In the settings, go to SSL Certificate -> Click Replace Certificate



Make sure Import Key and Certificate is selected -> Click Next

RSA 2048 should be selected by default. Select the appropriate files. In my example prism.domain.lab.key for the Private Key, prism.domain.lab.cert.pem for the public certificate and domain1.ca.cer for the CA certificate chain -> Click Import Files

It may take a few moments; the window should reload, and there will no longer be a certificate warning on Prism. The certificate installation is the same for both Prism Element and Prism Central.

The certificate can be viewed in the browser and it should match the certificate created earlier.
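
As a final check from the command line, you can pull the certificate that Prism is now presenting and confirm the subject, issuer and validity dates. This is just a quick sketch assuming the example hostname used throughout this post and Prism's default HTTPS port of 9440:

# openssl s_client -connect prism.domain.lab:9440 -servername prism.domain.lab </dev/null 2>/dev/null | openssl x509 -noout -subject -issuer -dates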




July 22, 2020

Don't Use Your Physical Image in Your Virtual Environment


Are you using SCCM, WDS, or other deployment tools, or have you been asked to when deploying your virtual desktops or virtual application servers? If so, there can be some serious issues with this. I am often asked about this by folks wanting to deploy Citrix or VMware Horizon images using the same image that is used for physical endpoints. Not only is this a bad idea, it can present performance ramifications and also make it so that best practices are not followed.

I have always been a believer that hand building the operating systems for virtual desktops and application delivery servers is the best approach because it ensures we know what went into the image. I understand the gripes about manually installing the applications and the extra work, but the extra work now can save a lot of headaches later, and "this is how we build our images" is not a good enough reason to justify using the same image in the virtual environment. Often, and in most cases, the person doing the deployments and the person running the virtual desktop environment are not the same. They build images on physical endpoints or on a completely different hypervisor, they never optimize the image, and they just let things fly. Since these are physical endpoints with dedicated hardware, they rarely if ever experience any issues from being unoptimized. In the datacenter, on a virtual desktop or an application delivery server that shares host resources with other virtual machines, we need to optimize things as much as possible.

Here are two examples of recent environments where there were issues with using SCCM to deploy the same image as physical endpoints:

  1. The first was in the medical field, where the customer wanted to move from persistent Windows 10 desktops to pooled non-persistent virtual desktops, as the administrative overhead of having a persistent desktop and having to administer the desktops with deployment tools was not feasible. Also, when asked to justify the need for a persistent desktop pool, the response was "that is how we have deployed it before", so there really was no reason to have it. When it came time to build the Windows 10 non-persistent image, the customer completely disregarded my suggestion to build the Windows 10 base image by hand and used WDS to deploy the "standard" image that is deployed on physical endpoints. The end result was a known bug in the image in which the Start menu stopped responding to left clicks. This bug also existed on physical endpoints but was hacked around by copying profiles over the default profile; when this was done on the non-persistent desktop image, it caused Citrix Profile Management to create temp profiles on each login. After countless days of the customer trying to remediate this, the only successful fix was to break out the ISO, install the operating system by hand, and manually install the applications; everything is now functioning correctly.
  2. A second example was a large law firm migrating from an on-prem Citrix environment to VMware Workspace ONE. When it came time to build their images for the RDS Linked Clone pool, they stressed a need to use an existing task sequence that was built for Windows 10 and force it to target a Windows Server 2016 operating system. The issue here is that the applications were installed first and the RDS Session Host role was installed afterwards. It has long been a known best practice for RDS Session Host servers that the RDS Session Host role be installed prior to installing applications, due to the need to potentially capture application settings into the RDS shadow key. In this environment, there are small abnormalities in application behavior even today due to the incorrect installation sequence.

Long story short, when building the images for your virtual desktops and application delivery servers, be careful how you approach this. As the common adage goes, "you can't build a house on a bad foundation", and doing things incorrectly could lead to a bad user experience.

Johnny @mrjohnnyma

July 6, 2020

Citrix License Usage Insights

Purpose

This article describes a new Citrix Cloud service, License Usage Insights, that is available to all Citrix Virtual Apps and Desktops customers. Read on to find out why I think this is a big deal, especially for customers that have not transitioned to Citrix Cloud subscription licensing (aka still own perpetual licenses).


Trend View: day view but month and year available


Symptom

If you have a single on-prem Virtual Apps and Desktops license server, OR have completely transitioned to the Virtual Apps and Desktops service, then this new License Usage Insights service may not be for you, because Studio already gives you a good view of license consumption.

This new service may be beneficial if you have a larger environment that has gotten complicated over the years and there is not a simple answer to, "how are we doing on Citrix licenses?".

Beneficial Scenarios

  • You are a Citrix architect or admin and use Excel to calculate Virtual Apps and Desktops license usage across your enterprise
  • Or you have more than one license server (for whatever reason)
  • Or you have license servers in completely separated Active Directory forests
  • Or you have more than one license type, whether that is Virtual Apps vs. Virtual Apps and Desktops, OR you own both concurrent and user/device licensing
  • Or your management likes to see pretty graphs of Citrix consumption from time to time
  • Or you would like to give someone in your organization access to consumption but do not want to give them RDP access to the actual Citrix license servers
  • Or you spend time logged into citrix.cloud.com and it would be more convenient to view license consumption there

Resolution

If one or more of the scenarios above apply to you, then read on. License Usage Insights connects your on-prem license server(s) to Citrix Cloud. It can then aggregate usage across many license servers and present it in a pretty dashboard.

Setup

  1. Upgrade your license server. You will need to be running version 11.15.0.0 or newer. Download the license server here https://www.citrix.com/downloads/licensing/. Upgrade instructions https://docs.citrix.com/en-us/licensing/current-release/upgrade.html
  2. Enable Call Home on the license server.  If you do not have this enabled you will not get the blue "Register" button shown in the screenshot below. See the link in the next step for the details.
  3. Register the license server with Citrix Cloud https://docs.citrix.com/en-us/citrix-cloud/citrix-cloud-management/citrix-cloud-on-premises-registration.html
  4. Wait 24 hours for it to report in the first time. This is a very important step.
  5. Log in to citrix.cloud.com and check. As mentioned above, no matter how many times I refreshed the page, it took 24 hours to populate. Click the menu on the left and choose Licensing.

You didn't think I would show a real code, did you?


Click Register


FWIW, upgrading the Citrix license server was standard practice back when I was consulting, any time I was upgrading anything else in the environment. Not only would you typically get security improvements and bug fixes, but it would ensure that a component upgrade would not get halted due to newer license server requirements.

That is all there is to it. I hope this gives you better visibility into your environment.

SageLike Post ID: SL0025

Applies to:

References:

Brian @sagelikebrian

June 15, 2020

Virtual Apps and Desktops in 2020

My colleagues Mayank Singh and Rob Beekmans, both Architects in Technical Marketing, packed a ton of good information into this video. If you want to see what's new with Citrix Virtual Apps and Desktops, Citrix Managed Desktops, and Citrix SD-WAN in 2020 (so far) and have 91 minutes, I recommend watching the whole thing. Here are a few demos and sections that I want to provide shortcuts to.
 


 
Microsoft Teams Optimization
Browser Content Redirection (BCR)
  • Browser Content Redirection renders whitelisted webpages on the endpoint and seamlessly feeds them back into the session. Offloading video rendering to endpoints provides a great user experience and reduces backend VDA resource consumption.
  • Supported VDA browsers: Internet Explorer, Chrome, and Edge (new Chromium). Edge is in Tech Preview.
  • Overview and configuration starts here https://youtu.be/UcEmqQjdQUY?t=2925
  • Configuration and demo video of watching YouTube without and with BCR https://youtu.be/UcEmqQjdQUY?t=3250
  • Learn more in Citrix Docs
FSLogix and Office 365
  • FSLogix will only save data for a single session. It does not support accessing multiple sessions and consolidating them into its profile container. Use Citrix User Profile Manager to write profile data back to FSLogix when accessing more than one session at a time https://youtu.be/UcEmqQjdQUY?t=1985

Machine Creation Services (MCS)
  • What is it and how does it work? https://youtu.be/UcEmqQjdQUY?t=656
  • Machine Creation Services Input-Output (MCSIO). Deeper dive into MCS workings. It was revamped in version 1903 for on-prem hypervisors and Azure. Allows for placement of the master and caching disk on different storage. This allows you to use HDD (vs SSD) which scales higher (more users per machine) and provides better response time for users https://youtu.be/UcEmqQjdQUY?t=1239
  • Publishing an app or desktop using Citrix Managed Desktop (Citrix TechZone) and MCS.  Keep in mind Managed Desktops has a simplified web wizard vs Studio https://youtu.be/UcEmqQjdQUY?t=754
  • Azure on-demand provisioning using MCS. It creates machines on power-on which means you only pay for what you use. Overview and demo using Apps and Desktop Service https://youtu.be/UcEmqQjdQUY?t=1073
Citrix App Layering
Business Continuity


I hope this furthers your understanding of what's new in Citrix Virtual Apps and Desktops.  Stay tuned for a bunch of exciting announcements in the second half of 2020.

March 8, 2020

Citrix Storefront + Netscaler GW Optimal Gateway Routing. 1 Farm/Site - Multiple Zones


Purpose:
This post shares how I set up Optimal Gateway Routing (Optimal HDX Routing) using zones. I found a lot of documentation on how to set this up for multiple farms (noted below); however, I wasn't able to find detailed information on how to set this up for just zones.


A quick rundown of the environment I'm working on here:
  • 1 Citrix Virtual Apps & Desktops Farm/Site
  • Delivery Controllers in 2 geographically separated datacenters
  • Pairs of NetScalers, also in the 2 datacenters noted above
  • 9 zones within the farm: 2 within the above-mentioned datacenters and 7 in offices across the nation
  • VDIs published out of the datacenters, and test publishing out from the offices
  • Apps published out of all locations (initially for local access at each office)

I'm sure there may be questions about how/why we are publishing apps and desktops this way, but I don't want to get too far into the weeds about that.

Anyway, you probably came here because you ran through the other articles you googled and didn't get what you were looking for when you got ready to configure the zone portion within the StoreFront Optimal HDX Routing settings. If you haven't read those articles yet, please go run through them (see the references below); they have all the prerequisites you'll need to start this process. Once you have your certificates with SANs for each of your NetScaler Gateways and have applied them, this is where we will pick up.


Directions


Configure Delivery Controller Zones

In your StoreFront configuration, highlight your store and manage the Delivery Controllers for the site. Here I've created 2 different groups of controllers -- one for each datacenter.



Once you have those separated, highlight the first set and click Edit

In the Edit Delivery Controller window, choose the Advanced Settings -- Settings button

  



In the Configure Advanced Settings window, click on the area next to the Zones field
 


In the Delivery Controller Zone Names window click on the Add button and fill in the zone names per appropriate site.
 



Once you fill in each of the zones, it should look like below. Click OK. Follow this same procedure for your other zone(s).





Configure Storefront Optimal HDX Routing (AKA - Optimal Gateway Routing)



Complete the next steps after all of your zones have been configured. 


Highlight the site and then click the Configure Store Settings link on the right. 
 

At this point I'm assuming you have gone through the setup of the location-specific NetScalers. If you haven't yet, this is a good article: https://support.citrix.com/article/CTX215663



In the Configure Store Settings window, click on the Optimal HDX Routing tab on the left, then highlight the location that we are configuring. Once the site is highlighted, click on the Manage Zones button.

  
In the Manage Zones window click on the Add button and fill in the zone names per appropriate site. 
 

After you have filled them in, they should look like below as appropriate to each location. Click OK. (This should look familiar, as we did this for the Delivery Controllers earlier; these should mimic those settings per location.)
 



Back in the Configure Store Settings window the settings should be updated like below.


Rinse and repeat this procedure for your other location(s). When completed it should look like this. Click Apply and then click OK.

Oh, and one more thing: make sure to turn on user mapping and aggregation in StoreFront to prevent users from having multiple icons for the same published app/desktop. 
https://docs.citrix.com/en-us/storefront/current-release/set-up-highly-available-multi-site-stores.html


References:
https://support.citrix.com/article/CTX215663
https://www.carlstalhood.com/storefront-cr-configuration-for-citrix-gateway/#optimal
https://www.jgspiers.com/storefront-high-availability-optimal-routing/ 

SageLike Post ID: SL0025