Automating a Kiosk-Style Deployment of Ubuntu + VMware Workstation Pro

In large lab and nested-lab environments, being able to deploy a consistent reference machine — quickly, reliably, and with minimal manual intervention — can significantly streamline operations. That’s why I put together the ubuntu_kiosk_vmware_workstation_pro project: a bootable USB-flash drive solution based on Ubuntu that automatically installs VMware Workstation Pro (version 25H2 for Linux), extracts a pre-built VM image, and presents a kiosk-style desktop that boots directly into the VM.
The end goal: plug in the USB, install the OS, reboot, and you’re immediately at a desktop that auto-logs in and launches your VM — no interactive wizardry required.

Screenshot of a Windows 10 virtual machine running VMware Workstation Player 25H2, displaying the Process Explorer tool with a list of processes and resource usage metrics.

Why this matters

Here are a few of the use-cases this addresses:

  • Standardized workstations in training labs or demo rooms: all machines behave identically.
  • Rapid redeployment of broken or repurposed machines: minimal setup time.
  • Nested lab or proof-of-concept usage where you want a “desktop that just launches the VM” experience.
  • Non-technical staff can perform re-builds without needing deep OS knowledge.
  • Re-purpose older hardware and reuse prior Microsoft OS licenses in dedicated VMware images, rather than throwing them away.

How it works — architecture overview

At a high level, the workflow is:

  1. Boot from the custom USB flash drive (Ubuntu desktop image modified).
  2. The auto-installer lays down the Ubuntu OS on the target HDD/SSD, installing the necessary .deb packages offline in case no network is present.
  3. A first reboot happens into the newly installed OS; during this stage the script installs VMware Workstation Pro, extracts the compressed VM image, adds desktop links, and prompts final reboot.
  4. On the second boot, the system auto-logs into a user session, launches the extracted VM via VMware Workstation Pro, and the user is immediately in the VM desktop — no login prompt.
     Overall durations: 10–15 min for the OS install plus offline package install, 10–30 min for the first reboot, Workstation install, and VM extraction, then 2–5 min after the second reboot for auto-login and auto-VM launch. Architecture diagrams and process-flow images are included in the repo.

Key components

Here are some of the major building blocks:

  • Custom Ubuntu ISO: Based on Ubuntu Desktop, but edited to include preseed/cloud-init style scripts (user-data/meta-data) so the OS can install with minimal input.
  • postinstall.sh & postinstall2.sh: One runs in the chroot environment to set base configuration; the other runs in the user context to set up auto-login, desktop links, and Workstation shortcuts.
  • Offline .deb packages folder: To allow installation even if the workstation has no network connectivity.
  • VM image compressed and split into 4GB chunks (to accommodate FAT32 compatibility on the USB).
  • Bootloader modifications (grub.cfg, loopback.cfg) pointing to the /autoinstaller folder where meta-data and user-data live.
  • user-data must be in YAML format and begin with #cloud-config and use spaces, not tabs.
  • Ensure all modified files are UTF-8 and Unix LF end-of-line encoding.
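To make the user-data requirements above concrete, here is a minimal illustrative sketch. The values (hostname, username, the password hash, the copied script path) are placeholders, not the project's actual file, which contains more (storage layout, additional late-commands, etc.):

```yaml
#cloud-config
# Minimal autoinstall user-data sketch. Must start with "#cloud-config"
# and be indented with spaces only, never tabs.
autoinstall:
  version: 1
  identity:
    hostname: kiosk01            # placeholder hostname
    username: kiosk              # placeholder auto-login user
    password: "$6$examplehash"   # crypted password hash (placeholder)
  late-commands:
    # copy the first-stage post-install script into the installed system
    - cp /cdrom/autoinstaller/postinstall.sh /target/root/
```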

Deployment steps (summary)

Here’s a condensed version of the steps to use this project:

  1. Download the Ubuntu Desktop ISO (from Ubuntu.com).
  2. Download the Linux VMware Workstation Pro binary (from the Broadcom support site).
  3. Use an ISO editor (e.g., UltraISO, WinISO) to inject the custom folder structure and files (bootloader configs, /autoinstaller directory).
  4. Burn or write the resultant modified ISO to a USB flash drive (e.g., using Rufus if on Windows).
  5. Copy any custom configuration or installation files to the /media folder, e.g., the VMware Workstation Pro bundle, offline .deb files, and the VM image.
  6. Boot the target workstation from the USB drive and allow the automated installer to run through its process.
  7. After reboot, verify the VM launches and auto-login is working.

Tips & considerations

– Ensure the target hardware supports booting from USB and that you’ve configured BIOS/UEFI accordingly.
– If network connectivity is limited, ensure all necessary .deb packages and VM image files are included offline.
– Monitor the storage size on the USB: large VM images and split 4GB chunks will require sufficient capacity.
– The auto-login and VM launch assume the user session will be dedicated to running that VM — this may not suit environments where the desktop is shared for other tasks.
– If you plan to update the VM image (or VMware Workstation version), you only need to update the /media folder; there is no need to rebuild the USB image completely.
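The 4GB split mentioned above (and in the key components) can be rehearsed end-to-end with standard `split` and `cat`. A sketch with a small stand-in file; real usage would be `-b 4G` (just under the 4 GiB FAT32 file-size limit) against the actual image:

```shell
#!/bin/sh
# Round-trip test of the FAT32-safe split: split, reassemble, verify checksums.
tmp=$(mktemp -d) && cd "$tmp"
dd if=/dev/urandom of=vm-image.tar.gz bs=1M count=8 2>/dev/null  # stand-in for the real VM image
split -b 3M -d vm-image.tar.gz vm-image.tar.gz.part              # produces .part00 .part01 .part02
cat vm-image.tar.gz.part* > vm-image.rejoined.tar.gz             # reassembly on the target machine
cmp vm-image.tar.gz vm-image.rejoined.tar.gz && echo "round trip OK"
```

Because `split -d` numbers the chunks, a plain shell glob reassembles them in the correct order.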

VMware Workstation Pro

Download the latest VMware Workstation Pro binary for Linux from the Broadcom Support Site under Free Software Downloads. Note: the site requires registration to log in and download files. [as of 11/25] https://profile.broadcom.com/web/registration

A screenshot of the Broadcom Support Portal user registration page, featuring fields for entering an email address and a CAPTCHA image.

After login, you will land on a generic dashboard. Select the upper-left “hamburger” menu, then select the “My Downloads” icon.

Split screen image showing the Broadcom Software dashboard, with a welcome message and options for 'My Dashboard' and 'My Downloads' on the left, and a dark theme with similar options on the right.

Please select the “Free Software Downloads” section. Then, in the search edit box, type “vmware workstation”.

Screenshot of the Broadcom support website showing the 'My Downloads' section with a highlight on 'Free Software Downloads available HERE' and a search bar for 'vmware workstation'.

Select the version of VMware Workstation Pro for Linux. Click the hyperlink/checkbox to accept the “Terms & Conditions”, then click the download icon.

Screenshot of the Broadcom website displaying the download section for VMware Workstation Pro for Linux 25H2, highlighting the release selection and terms and conditions agreement.

USB Recommendations:

Minimally, use a USB 3.2 flash drive. If your laptop or older device supports Thunderbolt (USB-C), consider an M.2 drive in a Thunderbolt enclosure; the speed difference will amaze you. Even if you use the Thunderbolt enclosure over a USB 3.2 Type-A connector, it will still be very fast.

Image showing the ports and specifications of a desktop computer model MS-01 with labels indicating USB Type-A and Type-C ports, transfer speeds, and installation times for different operating systems.

Modernizing Identity Portal Migrations with AI: Navigating Embedded Scripts and Plugin Frameworks

Introduction

The Symantec (CA/Broadcom) Identity Portal is widely used for managing IAM workflows with customizable forms, tasks, and business logic. The tool allows its business logic to be exported from the management console.

However, a major challenge exists when migrating or analyzing environments such as Dev → Test → Prod using these exported Portal files. Although configuration migration tools are available, reviewing and verifying changes is difficult: Portal exports are delivered as a single compressed JSON one-liner, making it hard to identify meaningful changes (“deltas”) without a large manual effort.


Challenge 1: Single-Line JSON Exports from Identity Portal

A typical export has over 88K characters in a single line. Try searching within that string to find the object you wish to change or update.

Identity Portal’s export format is a flat, one-line JSON string, even if the export contains hundreds of forms, layout structures, and JavaScript handlers.

Migration/Analysis Risks

  • Impossible to visually scan or diff exports.
  • Nested structures like layout, formProps, and handlers are escaped strings, sometimes double-encoded.
  • Hidden differences can result in subtle bugs between versions or environments.

A Solution

We created a series of PowerShell scripts, developed with AI assistance, that select the best key-value pairs to sort on so the output is human-readable and searchable, reducing the complexity and effort of the migration process. We can now isolate minor delta changes that would otherwise remain hidden until a use-case exercised them later in the migration effort, requiring additional rework.

  • Convert the one-liner export into pretty-formatted, human-readable JSON.
  • Detect and decode deeply embedded or escaped JSON strings, especially within layout or formProps.
  • Extract each form’s business logic and layout separately.

These outputs allow us to:

  • Open and analyze the data in Notepad++, with clean indentation and structure.
  • Use WinMerge or Beyond Compare to easily spot deltas between environments or versioned exports.
  • Track historical changes over time by comparing daily/weekly snapshots.
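The prettify step itself is simple to reproduce. The project’s scripts are PowerShell; the sketch below shows an equivalent step with python3’s built-in `json.tool` module (the sample JSON is illustrative, not a real Portal export):

```shell
# Expand a one-line export into indented, diff-friendly JSON.
echo '{"forms":[{"name":"UserForm","props":{"required":true}}]}' \
  | python3 -m json.tool > pretty.json
cat pretty.json   # each key now sits on its own line, ready for WinMerge/Notepad++
```

Once every export is prettified the same way, line-oriented diff tools can pinpoint deltas that are invisible in the one-liner form.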

Challenge 2: Embedded JavaScript Inside Portal Forms

Identity Portal forms often include JavaScript logic directly embedded in the form definition (onLoad, onChange, onSubmit).

Migration Risks

  • JS logic is not separated from the data model or UI.
  • Inconsistent formatting or legacy syntax can cause scripts to silently fail.
  • Broken logic might not surface until after production deployment.

Suggested Solutions

  • Use PowerShell to extract JS blocks per form and store them as external .js.txt files.
  • Identify reused code patterns that should be modularized.
  • Create regression test cases for logic-heavy forms.
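The extraction of JS blocks can be sketched as follows. The field names used here (`forms`, `name`, `onLoad`) are hypothetical stand-ins; real Portal exports use their own schema, and the project’s actual tooling is PowerShell:

```shell
#!/bin/sh
# Create a tiny stand-in export, then pull each embedded handler out
# into its own .js.txt file for review and diffing.
cat > export.json <<'EOF'
{"forms":[{"name":"UserForm","onLoad":"console.log('form loaded');"}]}
EOF
python3 - <<'EOF'
import json
data = json.load(open('export.json'))
for form in data['forms']:
    path = form['name'] + '.onLoad.js.txt'
    open(path, 'w').write(form['onLoad'])   # one external file per handler
    print('wrote', path)
EOF
```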

Challenge 3: Form Layouts with Escaped JSON Structures

The layout field in each form is often a stringified JSON object, sometimes double or triple-escaped.

ANA provides in-depth analysis of the Symantec Identity Portal business logic, embedded JavaScript, and Java plugins to assist with migration.

Migration Risks

  • Malformed layout strings crash the form UI.
  • Even minor layout changes (like label order) are hard to detect.

Suggested Solutions

  • Extract and pretty-print each layout block to .layout.json files.
    • Please note: while the output is pretty-printed, it is not strictly valid JSON due to the escape sequences. Use these exported files as searchable research material to help isolate deltas to be corrected during the migration effort.
  • Use WinMerge or Notepad++ for visual diffs.
  • Validate control-to-field binding consistency.

Using our understanding of the Identity Portal format for the ‘layout’ property, we were able to identify AI-assisted methods to manage the double- or triple-escaped characters that were troublesome to export consistently. Our service engagements now incorporate greater use of AI and associated APIs to support migration efforts and process modernization, with the goal of minimizing business risk for our clients and our organization.
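The core of the double-escape handling is just repeated decoding: parse the record, then parse the stringified ‘layout’ value again. A minimal sketch (the sample record is illustrative, and the project’s real tooling is PowerShell):

```shell
# Two json.loads passes undo one level of string-escaping each.
python3 - <<'EOF' > layout.pretty.json
import json
# 'layout' as it might appear in an export: JSON encoded inside a JSON string
record = '{"layout": "{\\"rows\\": 2, \\"cols\\": [\\"name\\", \\"email\\"]}"}'
layout = json.loads(json.loads(record)['layout'])   # pass 1: record, pass 2: layout
print(json.dumps(layout, indent=2))
EOF
cat layout.pretty.json
```

A triple-escaped value simply needs one more `json.loads` pass; the pattern is the same.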


Challenge 4: Java Plugins with Multiple Classes

Many Portal instances rely on custom Java plugins with dozens of classes, Spring beans, and services.

Migration Risks

  • Portal API changes break plugins.
  • Lack of modularity or documentation for the custom plugins.
  • Missing source code for compiled custom plugins.
  • Difficult to test or rebuild.

Suggested Solutions

  • In the absence of custom source code, decompile plugins using jd-gui.
  • Rebuild with Maven/Gradle in modern IDEs.
  • Isolate logic into reusable service layers.

Testing and Validation

  • Pretty JSON confirms field mapping.
  • Layouts render in Dev, Test, and Prod.
  • Plugins respond with valid output.
  • JS handlers trigger as expected.

Tools and Techniques

  • PowerShell: Prettify JSON, extract layouts/handlers.
  • Notepad++: Review JSON and scripts.
  • WinMerge / Beyond Compare: Diff exports and track changes.
  • jd-gui: Java decompilation for plugin reverse engineering.

Recommendations for Future-Proofing

  • Store layouts and handlers in Git.
  • Modularize plugin code.
  • Version control form definitions.
  • Automate validation tests in CI or staging.
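The Git recommendation above can be sketched end-to-end: commit a prettified export as the baseline, then diff the next snapshot against it. Everything here is illustrative sample data, assuming `git` and `python3` are available:

```shell
#!/bin/sh
# Track prettified exports in Git so each snapshot's delta is reviewable.
tmp=$(mktemp -d) && cd "$tmp"
git init -q exports && cd exports
echo '{"form":"v1"}' | python3 -m json.tool > forms.pretty.json
git add forms.pretty.json
git -c user.email=demo@example.com -c user.name=demo commit -qm "baseline export"
echo '{"form":"v2"}' | python3 -m json.tool > forms.pretty.json   # next snapshot
git diff -- forms.pretty.json   # the delta is now a small, reviewable change
```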

Conclusion

Migrating Identity Portal environments requires more than copy-pasting exports. In the absence of proper implementation documentation around customizations, it may require reverse engineering, decoding, and differencing of deeply nested structures.

By extracting clean, readable artifacts and comparing across environments, teams will gain visibility, traceability, and confidence in their migration efforts.

Review our GitHub collection of the above-mentioned scripts. Please reach out if you would like assistance with your migration processes and challenges. We can now progress toward automating the business logic migration from one environment to the next.

https://github.com/anapartner-com/identity_portal

RHEL 9.x and VMware Workstation

VMware Workstation 17.6.2 appears to install with no issues on Red Hat 9.5. However, we do see an issue when running the vmware GUI, caused by a library file bundled with VMware Workstation: the vmware binary reports an error involving libxcb.

Let’s walk through the installation and recreate this issue.

1) Download the latest release of VMware Workstation with wget using the embedded URL provided by the VMware Workstation Help / Software Updates screen

wget https://softwareupdate.vmware.com/cds/vmw-desktop/ws/17.6.2/24409262/linux/core/VMware-Workstation-17.6.2-24409262.x86_64.bundle.tar


2) Extract the install package from the tar archive, then install VMware Workstation

tar -xf VMware-Workstation-17.6.2-24409262.x86_64.bundle.tar
chmod +x VMware-Workstation-17.6.2-24409262.x86_64.bundle
sudo ./VMware-Workstation-17.6.2-24409262.x86_64.bundle


3) Validate no issue with the "modconfig" process.

sudo vmware-modconfig --console --install-all
sudo vmware-modconfig --console --install-status


4) Stop and start vmware service.   Verify no error messages.

sudo systemctl stop vmware.service
sudo systemctl start vmware.service
sudo systemctl status vmware.service


5) Attempt to start the GUI of VMware Workstation with the binary 'vmware'
# Assumption(s):
# X11 libraries are installed
#   sudo dnf install xorg-x11-xauth xorg-x11-fonts-\* xorg-x11-utils dbus-x11
# Xterm is available within VNC or SSH tool or similar to view X11 UI
#   ssh session has DISPLAY variable populated.

vmware

vmware startup error message: /lib64/libxcb-shm.so.0: undefined symbol: xcb_send_request_with_fds

After running strace -e openat vmware and other trial-and-error steps, we identified that the issue is not with the library file named in the error message (“/lib64/libxcb-shm.so.0”) but with the preceding library, “libxcb”, that references it: VMware bundles its own copy of libxcb, which conflicts with the RHEL 9.5 OS library file. Below are the steps we followed to resolve this issue.

1) Confirm OS library file package is installed for "libxcb"

$ rpm -qa libxcb
libxcb-1.13.1-9.el9.x86_64


2) View both the VMware library file and OS library file for libxcb.


$ file /usr/lib/vmware/lib/libxcb.so.1/libxcb.so.1
/usr/lib/vmware/lib/libxcb.so.1/libxcb.so.1: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=eee92f55158e394bb77038017d22e8c603ddd2fc, stripped


$ file /usr/lib64/libxcb.so.1
/usr/lib64/libxcb.so.1: symbolic link to libxcb.so.1.1.0


$ file /usr/lib64/libxcb.so.1.1.0
/usr/lib64/libxcb.so.1.1.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=cdde44211c195a2f0b721e58af2796ababf05caf, stripped



3) Backup the VMware library file, and then create a softlink to the OS library file.

sudo mv /usr/lib/vmware/lib/libxcb.so.1 /usr/lib/vmware/lib/libxcb.so.1.bak

sudo ln -s /usr/lib64/libxcb.so.1 /usr/lib/vmware/lib/libxcb.so.1
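If you want to rehearse this backup-and-symlink pattern before touching /usr/lib/vmware, the same steps can be dry-run in a temp directory with stand-in files (illustrative only; the real VMware path nests the .so inside a directory of the same name):

```shell
#!/bin/sh
# Dry run of "back up the bundled library, symlink to the OS copy".
tmp=$(mktemp -d)
mkdir -p "$tmp/vmware/lib" "$tmp/lib64"
echo "bundled (VMware) libxcb" > "$tmp/vmware/lib/libxcb.so.1"
echo "OS (RHEL) libxcb" > "$tmp/lib64/libxcb.so.1"
mv "$tmp/vmware/lib/libxcb.so.1" "$tmp/vmware/lib/libxcb.so.1.bak"  # step 3a: back up
ln -s "$tmp/lib64/libxcb.so.1" "$tmp/vmware/lib/libxcb.so.1"        # step 3b: symlink
readlink "$tmp/vmware/lib/libxcb.so.1"   # confirm it points at the OS copy
cat "$tmp/vmware/lib/libxcb.so.1"        # vmware would now load the OS library
```

The .bak file preserves an easy rollback path: remove the symlink and move the backup into place.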



4)  Now start vmware at the command line.

vmware

It should now start with no issue.

Below is a view of the current OS & kernel of RHEL 9.5 to compare with your environment.

$ cat /etc/os-release
NAME="Red Hat Enterprise Linux"
VERSION="9.5 (Plow)"
ID="rhel"
ID_LIKE="fedora"
VERSION_ID="9.5"
PLATFORM_ID="platform:el9"
PRETTY_NAME="Red Hat Enterprise Linux 9.5 (Plow)"
ANSI_COLOR="0;31"
LOGO="fedora-logo-icon"
CPE_NAME="cpe:/o:redhat:enterprise_linux:9::baseos"
HOME_URL="https://www.redhat.com/"
DOCUMENTATION_URL="https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/9"
BUG_REPORT_URL="https://issues.redhat.com/"
 
REDHAT_BUGZILLA_PRODUCT="Red Hat Enterprise Linux 9"
REDHAT_BUGZILLA_PRODUCT_VERSION=9.5
REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux"
REDHAT_SUPPORT_PRODUCT_VERSION="9.5"

$ uname -r
5.14.0-503.21.1.el9_5.x86_64

Optional: Address the “warning” messages about “<reset-dirs>” fonts and proxy.xml

5) Optional: Remove warning message about fonts and proxy.


5a) Create a new file with one line of <xml/> for  proxy.xml

sudo mkdir -p /etc/vmware/hostd

echo "<xml/>" | sudo tee /etc/vmware/hostd/proxy.xml



5b)  Edit the conf file and comment out the line for <reset-dir>

sudo vi  /usr/share/fontconfig/conf.avail/05-reset-dirs-sample.conf  

RHEL 9.5 upgrade steps (removed ‘lock’ of version with -unset switch)

sudo subscription-manager release –unset    [Repo were locked to previous version]
sudo subscription-manager repos --list-enabled
sudo dnf clean all
sudo dnf makecache
sudo dnf update       [This may take 30-60 min]
cat /etc/os-release
reboot

Download software via Help / Software Updates or use the embedded URL directly.

https://community.broadcom.com/vmware-cloud-foundation/discussion/rhel-95-and-vmware-workstation-startup-error-lib64libxcb-shmso0-undefined-symbol-xcb-send-request-with-fds

The Power of Multiple Incognito Windows

Custom User Data Directories and Advanced DNS/TLS Management

In the world of web development, testing, or any activity requiring precise browser session handling, juggling multiple configurations can quickly become overwhelming. Fortunately, modern browsers like Google Chrome offer powerful features that, when combined with a bit of command-line magic, can make your life significantly easier. Let’s dive into the usefulness of having multiple incognito windows using different user-data-dirs and managing TLS certificates with dedicated DNS mapping.

Why Multiple Incognito Windows?

Incognito mode is a great way to open a clean browser session without carrying over cookies, cache, or other data from your primary browsing experience. However, opening multiple incognito windows in a standard configuration doesn’t isolate sessions—they share the same incognito context. This is where the --user-data-dir flag comes in.

By specifying a unique --user-data-dir for each incognito session, you’re effectively sandboxing your browser profiles. This is particularly useful for:

  • Testing Multi-User Applications: Simulate multiple users interacting with your web application without needing separate devices or browsers.
    • Very useful for having a regular user access an application while simultaneously using an admin ID with the same application on the same MS Windows host.
  • Isolating Session Data: Prevent session contamination when testing login states, cookies, or caching behavior.
    • High value: we can effectively have multiple sessions to the SAME application. This is a very important benefit.
  • Debugging Environments: Configure distinct profiles for staging, production, or development environments.
    • So useful when working with IP addresses that are not in the production DNS and we do not have access to MS Windows host file to create aliases for the IP addresses during testing.

Below are three (3) example commands with different user data directories, ensuring these browser sessions do NOT share a session or session cookies. We are no longer limited to a single incognito session!

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" ^
--incognito --user-data-dir="C:\Temp\ChromeSession1" https://www.example.com

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" ^
--incognito --user-data-dir="C:\Temp\ChromeSession2" https://www.example.com

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" ^
--incognito --user-data-dir="C:\Temp\ChromeSession3" https://www.example.com

Each session launched with a unique --user-data-dir path acts as a fully independent browser instance, complete with its own storage and settings. Choose a folder like C:\Temp where the Chrome browser can create the data folder itself without being blocked by restrictive folder permissions.

While we could run the above command lines at will, it is often more convenient to store each string in an MS Windows shortcut; we can easily create as many incognito sessions as we wish, each attached to a unique shortcut to the chrome.exe browser binary.

 

Managing TLS Certificates and DNS Mapping

When dealing with local development environments, you often need to work with IP addresses and TLS certificates. If you have access to your local MS Windows hosts file, you can directly edit it to map the IP address to a hostname/FQDN that matches the CN (subject) or SANs of the TLS certificate.

However, if you do NOT have access to the MS Windows hosts file, you are usually stuck using the IP address in the URL and dealing with TLS certificate warnings. Chrome, by default, is strict about TLS and DNS, which can lead to frustrating “Your connection is not private” warnings. However, with a few additional flags, you can streamline this process.

DNS Mapping with Host Resolver Rules

The --host-resolver-rules flag allows you to map specific hostnames to IP addresses directly from the command line, bypassing the system’s DNS configuration. This is incredibly useful for testing domains that don’t have publicly resolvable DNS records or for redirecting traffic to a specific server in a development environment.

Example:

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" ^
--host-resolver-rules="MAP *.iam-dev.us 192.168.2.102" ^
--incognito --user-data-dir="C:\Temp\ChromeSession" ^
https://www.iam-dev.us:8444/sigma

This maps any subdomain of iam-dev.us to the IP address 192.168.2.102 without needing to modify your system’s hosts file. The additional benefit is that we can map the hostname to match the CN (subject) of the TLS certificate, so we do NOT see any TLS certificate errors.

We now have NO issue viewing a certificate that matches the hostname we provided in the Chrome shortcut.

Below is a live example testing with the Symantec Identity Portal (IGA) solution, where we have two (2) separate incognito sessions with different --user-data-dir values to ensure the sessions are isolated. We also used --host-resolver-rules with an IP address that is not in our DNS, mapping it in the Chrome shortcut.

Note: a warning message will appear when we use the --host-resolver-rules flag, to ensure we are not being “hijacked” by someone else. Since we initiated this mapping ourselves, you may click the “X” and close the warning.

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --host-resolver-rules="MAP *.iam-dev.us 192.168.2.102" --incognito --user-data-dir="C:\Temp\ChromeSession"  https://www.iam-dev.us:8444/sigma  

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" --host-resolver-rules="MAP *.iam-dev.us 192.168.2.102" --incognito --user-data-dir="C:\Temp\ChromeSession2" https://www.iam-dev.us:8444/sigma/admin  

Handling Self-Signed Certificates

While the above works for an environment with a proper TLS certificate, we need an alternative for environments using self-signed certificates, where Chrome’s strict security measures can get in the way. Flags like --ignore-certificate-errors (only in safe, controlled environments!) or configuring your system to trust these certificates can help.

Combined with the --user-data-dir option, you can even preload trusted certificates into specific profiles for a seamless workflow.

Here’s an example of a full command:

"C:\Program Files (x86)\Google\Chrome\Application\chrome.exe" ^
--host-resolver-rules="MAP *.iam-dev.us 192.168.2.102" ^
--incognito --user-data-dir="C:\Temp\ChromeSession" ^
--ignore-certificate-errors ^
https://www.iam-dev.us:8444/sigma

With this setup, you’re:

  • Running an isolated session with its own user data directory.
  • Redirecting specific DNS queries to a development server.
  • Accessing a self-signed certificate-protected URL without unnecessary interruptions.

Conclusion

By leveraging Chrome’s --user-data-dir, --host-resolver-rules, and other advanced flags, you can tailor your browser environment to handle complex workflows with ease. Whether you’re a developer, tester, or IT specialist, these tools offer a robust way to manage multiple configurations, ensuring efficient and error-free operations.

So next time you’re troubleshooting or testing, remember: the right Chrome flags can be game-changing. Experiment with these options and streamline your workflow today!

Bonus: View of active switches

Use chrome://version to view the active command line switches in use.

Simplify Passkey Registration with your mobile phone

You’re probably familiar with those emails that claim to be from your bank, CPA, or services like Microsoft and Google. They use deceptive tactics, such as subtly altered URLs and stolen branding, to trick you into entering your credentials on fraudulent sites. If you’re tired or distracted, it’s easy to fall for these schemes, which can put you at serious risk.

The great advantage and the “beauty” of passkeys is their ability to render these phishing attempts ineffective. By design, passkeys prevent remote phishing through email or text from achieving their goals, ensuring your credentials stay secure.

Why this works?

The “magic” or brilliance behind this innovation lies in the collaborative efforts of the industry at large, spearheaded by the FIDO Alliance, to establish a unified “standard.” This unified standard leverages public/private key functionality in a way that is user-friendly, making it easier for individuals to secure their remote accounts.

This new standard is already compatible with browsers on laptops and mobile devices. However, because it’s still relatively new, adoption among banks and financial institutions has been limited. For instance, I currently have only one financial institution using this standard. Most organizations still rely on two-factor authentication methods, combining a password with either an SMS code or an authentication app. That said, major global companies like Google and Microsoft are leading the charge in rolling out passkeys, and it’s only a matter of time before broader adoption follows.

How this works?

Background: The public/private key process (asymmetric cryptography) evolved significantly between 1990 and 2010 to enable secure online access, such as shopping and banking. Here’s how it works: a shopping or banking website hosts its public certificate (visible to anyone). When you access the site, the site proves it holds the matching private key (accessible only to the site). If the proof validates against the public certificate, the connection is secure, and you can confidently proceed with your transactions.

New Phishing Risks: Cyber-criminals (aka “bad-guys”) have become adept at creating convincing fake websites with their own public/private keys, paired with stolen images and content to appear legitimate. For years, we’ve been trained to trust the lock symbol in the URL bar as an indicator of a secure connection. However, this is no longer foolproof. For instance, a malicious site could use subtle tricks, like replacing the letter “o” with a zero in “google.com,” to deceive users. These tactics highlight the need to go beyond basic visual indicators and “raise the bar” on security to ensure our online activities remain safe.

Raise the Bar: With passkeys, the private key stays securely on your device (e.g., phone, workstation, physical security key, or tablet), while the public certificate is stored exclusively on the site where you registered. Authentication can only occur when the public certificate and the private key’s signature match, rendering interception attempts useless. This significantly raises the security “bar” against malicious remote access attempts.

That said, passkeys are still relatively new to global adoption, and their rollout is ongoing. While there are challenges, such as addressing backup and replication, these should not deter you from leveraging this technology to enhance your personal security.

Register a ‘passkey’ with your mobile phone.

Why? We want the bigger screens of private/public laptops and workstations, plus the security of ‘passkey’ functionality with the mobility of a phone.

If you are accessing a website from your phone, you will have no issue registering the passkey, because all components on your phone trust each other, e.g. the hardware chip that stores the passkey (private key), the middleware software (Samsung Pass/Google Passkey/Apple Passkey), and your mobile browser.

The current challenge is using your mobile phone with your private laptop/workstation. This seems to be a glossed-over area in current online docs. There is an assumption that all users have tied their mobile phones to their current private workstation. Unfortunately, this is a “gotcha” with using ‘passkeys’.

Per the standard, Bluetooth is used as the primary communication between your mobile phone and the public/private laptop/workstation.

You can check if you have Bluetooth setup on your private laptop/workstation by using the built-in file copy operation. Below is a view of using this feature.

The mobile phone does have to advertise itself via Bluetooth first, but it does NOT have to be trusted (paired) with the workstation beforehand; the operation can be approved when it occurs. This is similar to how a ‘passkey’ will work with your workstation (public or private).

For registration with a ‘passkey’, I have found through trial and error that what worked best was to trust (aka “pair”) the mobile phone as a Bluetooth device first. Add and pair your mobile phone as you would a Bluetooth keyboard or mouse.

Ignore these false messages: choose another device

Now, when you hit a site via a browser on your workstation, you can continue even if you see a negative message stating “A passkey can’t be created on this device”. These messages refer only to your workstation (whether it alone can support the ‘passkey’). We don’t care about the workstation at this point, especially if you will eventually be traveling and want the passkey on your mobile phone.

Look for the button or message that states “Use another device” or similar message.

Example of ‘passkey’ registration. After you select your mobile phone, most websites will then offer a QR code to generate a new unique private-public key combo (aka the ‘passkey’) only for your account on their site. Use your phone, that has previously been Bluetooth paired to your private laptop/workstation, to register the ‘passkey’.

After you have registered the ‘passkey’, you can leverage it from your mobile browser as well as from any workstation/laptop, including public workstations. You should see a popup on your mobile phone via Bluetooth to authenticate with your ‘passkey’.

Now, even if you still get a phishing email and inadvertently click on it, when the fake site asks for a password and you know you have a ‘passkey’, you can stop the process in its tracks.

Please be aware that your original password is still on the proper, valid site; it has NOT been removed by adding a ‘passkey’ for authentication. If you lose your mobile phone, you will most likely need to register a new ‘passkey’ with a new mobile phone. The FIDO Alliance is working with vendors to allow possible recovery, but some may argue that, while useful, this defeats the purpose of a ‘passkey’ that ONLY you have access to. You can catch up on the latest recommendations.

As a backup to my mobile phone, I am a fan of the YubiKey 5C NFC. This model has room for 100 ‘passkey’s, and with its USB-C and NFC features I can use it with any mobile phone or laptop. Add a long PIN as well, so it satisfies “something you know” + “something you have”, similar to a mobile phone as the holder of the ‘passkey’.

Yubico’s long infographic document about the differences between ‘passkey’ types was useful.

Help your family, friends, and neighbors get on this ‘passkey’ bandwagon.

Hopefully, knowing how to manage the Bluetooth ‘gotcha’ for registration will let you use your mobile phone more effectively with your laptop. You can still use an authenticator app + password as your intermediate authentication until your financial institutions and others switch over to offering ‘passkey’ as an option.

To learn more you may wish to review the longer blog entry.

Benefits of Passkeys: Stop Man-in-the-Middle / Phishing Attacks – ANA Technology Partner

Optional: Testing with an external site

A useful site to test a ‘passkey’ is webauthn.io.

After creating your ‘passkey’, visit the website again, but do not enter your username. Test with the Authenticate button to let the browser work with your workstation to pick a passkey associated with the website.

A view of the ‘passkey’ on a Yubikey via command line tool (ykman):

Fallback Registration – USB

If you still have an issue registering over Bluetooth, fall back to using a USB-C cable from your mobile phone to your private workstation/laptop. Please ensure that you have a “passkey” management middleware app on your phone enabled to work with the vendor’s website.

Example: Microsoft seems to prefer their Microsoft Authenticator to hold the “passkey” associated with your Microsoft email account (work, school, or otherwise). The image below shows I have many choices for the middleware application. Out of the box on my Samsung mobile phone it is Samsung Pass, but you can change this at will when registering your “passkey”. Recall that the actual “passkey” will still be stored in the hardware of the mobile device.

Within Microsoft Authenticator, when you select your MS email address (work/personal), you should see a “passkey” option. My previous attempts at registration over Bluetooth failed, but when I used the USB-C cable, I had no issue.

Benefits of Passkeys: Stop Man-in-the-Middle / Phishing Attacks

One of the primary advantages of using a passkey (certificate-based authentication) over passwords is defeating man-in-the-middle / phishing attacks.

You know those emails you receive that pretend to be from your bank, your CPA, or Microsoft/Google for your email. They have intentionally malformed URL addresses to trick you into entering your password into their own site, dressed up with stolen vendor images to look “real”. Well, if you are tired or busy and not paying attention, ouch: you have just put yourself at major risk by clicking that link and entering your credentials.

Let’s stop this nonsense and risk now. Help your friends and family members as well.

Why does this work? The magic is that the industry has agreed on a standard that uses public-private key functionality and makes it friendly for end-users with their laptops’ browsers and their phones. Since this standard is fairly new, you will only see a few banks using it; only one of my three banks offers it. It is being rolled out at most global companies, e.g., Google, Microsoft, etc.

How does it work? The passkey’s private key remains on your device (phone/workstation/physical security key/tablet); the passkey’s public key is ONLY on the one site where you registered. Only the correct public key can verify the private key’s signature, making interception useless. Yea us!
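The challenge-response idea behind this can be sketched with plain openssl commands. This is only an illustration of the underlying public-key principle, NOT the real WebAuthn/CTAP2 protocol, and all file names here are made up for the demo:

```shell
# Illustration of the passkey principle (not the actual WebAuthn flow).
set -e
workdir=$(mktemp -d) && cd "$workdir"

# 1. "Registration": the device generates a keypair; only the public
#    key is given to the website.
openssl genpkey -algorithm EC -pkeyopt ec_paramgen_curve:P-256 -out private.pem
openssl pkey -in private.pem -pubout -out public.pem

# 2. "Login": the site sends a random challenge; the device signs it
#    with the private key, which never leaves the hardware.
openssl rand -out challenge.bin 32
openssl dgst -sha256 -sign private.pem -out challenge.sig challenge.bin

# 3. The site verifies the signature with the stored public key.
#    A phishing site holds neither key, so it cannot fake this step.
openssl dgst -sha256 -verify public.pem -signature challenge.sig challenge.bin
# prints "Verified OK"
```

The phishing site can copy logos and login pages all day long; without the private key it can never produce a signature the real site's public key will verify.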

Please be aware that passkeys (as a process) are still relatively new for global usage, and the functionality is still being rolled out. Challenges around backup/replication are being addressed, but don’t let that stop you from adopting them to enhance your own personal security.

The good:

I had great luck using a physical security USB device, the YubiKey 5C NFC, for passkeys, ECA certificates, and standard one-time tokens (touch). Also, if I used an app or website on my mobile phone that supported passkeys, I had no challenges registering and using passkeys on that same phone. This was all good. I assumed I should be able to do the same passkey registration on my laptop with any browser, or integrate with my mobile phone, as mentioned on many online sites.

The bad:

However, I became very frustrated. I wanted to use passkeys on my laptop/workstation natively, or along with a mobile phone. Why was this so hard? I wanted this functionality for my email and any other online work while sitting at my desk. I did not want to use the small screen on the mobile phone to conduct my business.

I would see these types of scary error messages when I attempted to register a passkey on my workstation.

I had to dive deeper to see why my workstations had an issue, and why, with a slight change, I would continue to use my mobile phone or YubiKey for passkeys.

Testing Passkey (aka FIDO2) Functionality:

Pretest your workstation to see if you can use passkeys, using this test site: https://webauthn.io/ Enter a random string in the edit box, click Register, then click Authenticate. If you have no issues, you are well on your way to using passkeys.

With the Firefox browser, you can try a second site: https://webauthn.bin.coffee/ It is a deeper review but shows a similar process with “Create Credential” and “Get Assertion”. If this passes, please continue.

Hardware TPM 2.0

My first failure was due to using an older laptop/workstation. You must have a relatively new machine with TPM 2.0 in the BIOS. Unfortunately, a BIOS upgrade will not resolve this: passkeys require the newer hardware security functionality within TPM 2.0. Time to upgrade your laptop.


Check your TPM version with MS Windows’ Device Manager, or from a PowerShell command line type Get-TPM.

My second issue was the OS I was using on my updated laptop/workstation with TPM 2.0. Passkey architecture requires supported hardware (to store the key securely), middleware management software, and user interaction (via a browser or other client). For example, on MS Windows 10/11 the middleware management software is “Windows Hello” (aka WebAuthn), which mediates between the browser (user) and the hardware (where the passkey is stored).

However, on my newest laptop I am running MS Windows Server 2019 Standard, as I find it more stable for testing solutions. On this OS, the “Windows Hello” feature set is not fully enabled. While I could enable it via registry entries or a group policy (see below), I decided to stop here and focus on mobile-phone authentication for the workstation. The underlying OS functionality works as-is with passkeys on mobile phones, but there was a hitch.

Mobile Phone

As I reviewed the online documentation and specs, it became clear that a mobile phone (acting as the passkey storage, with a built-in security chip) should be able to communicate with any workstation (public or private) and provide the passkey over Bluetooth when asked. However, seeing is believing, and I was only successful with passkey registration on my private workstations after I trusted the mobile phone with the workstation.

Typically, on an MS Windows workstation, you may use Bluetooth for a keyboard/mouse/audio headset. You may also use it for file transfer between the workstation and other devices, e.g., mobile phones. The FIDO2 architecture uses Bluetooth as well, but with its own protocol.

Before I trusted the phone with the workstation, I would see attempts to use my mobile phone from the workstation, but they would eventually fail. Perhaps there is an automated feature I needed to enable to allow this. I enabled the Bluetooth trust between the mobile phone and the workstation before trying again.

Success!

If the application or website offers passkeys for authentication, please go ahead. Ignore any error/warning messages that say your “device” (i.e., workstation/laptop) does not support passkeys. Select “another device” if it is offered, then select your mobile phone. You should be able to progress and register a passkey on your phone.

Below are images from the phone, generated thanks to the Bluetooth “trust” with the workstation, when I select a passkey to be generated and stored on my mobile phone.

Now that your passkey is registered, you should be able to use any public workstation (or other machine) with your mobile phone, and not worry about your password being compromised. 🙂

Take-aways:

  1. Use passkeys as your primary authentication (if the website or app allows it).
    • When using your mobile phone, ensure Bluetooth is enabled and the phone is trusted by your non-public workstation (to assist with the initial passkey registration).
    • Consider using a physical security key with a PIN to hold the passkeys. Ensure the device has USB-C and/or NFC so you can use it with any modern workstation/iPad/mobile phone.
  2. Use an authenticator app (MS/Google/LastPass/Yubikey/Symantec VIP/RSA Auth/etc.) as a secondary authentication option, with or without your password.
    • Best if the site allows you to use only an authenticator, but most sites still require a password as the first credential (not perfect at this time, but better than a password alone).
    • Note that some websites let you register as many authenticator apps as you like.
  3. Use your password plus an SMS text as a third option, giving you a minimum of two-factor authentication. That is the bare minimum; we want better.
  4. If you have only password authentication, use some form of password management tool, e.g., KeePass, LastPass, 1Password, etc., and make the password as long as possible, e.g., 100 characters. Let’s see someone brute-force that. See the table of examples below.
| Tool | Features | Strengths | Concerns | Best For | Pricing |
|------|----------|-----------|----------|----------|---------|
| KeePass | Open-source, offline storage, extensive plugins, no cloud dependency | Highly secure, customizable | Manual syncing for multiple devices | Privacy-focused and advanced users | Free (open-source) |
| Text file secured by VeraCrypt | Offline storage, encrypted container for sensitive text files, open-source encryption tools | Fully offline, highly secure | Manual password entry; no automation | Privacy-focused users | Free (open-source) |
| LastPass | Cloud-based vault, password sharing, MFA, dark web monitoring | User-friendly interface | Previous data breaches | Personal and family use | Free and Premium versions |
| Dashlane | Cloud-based vault, password generator, VPN, dark web monitoring, autofill for payment details | Advanced security features | Expensive compared to others | All-in-one solution seekers | Free (limited) and Premium versions |
| 1Password | Cloud-based vault, travel mode, item-specific sharing, password health analysis, advanced MFA support | Great for families and teams | No free plan (trial available) | Families and advanced users | No free version |
| Bitwarden | Cloud-based vault, open-source, self-hosting option, MFA, password generator | Transparent and affordable | Less intuitive interface | Tech-savvy and budget-conscious | Free and Premium versions |
| Keeper | Zero-knowledge encryption, secure file storage, breach monitoring, advanced MFA | Enterprise-grade security | More expensive than alternatives | Professionals and businesses | No free version |
| RoboForm | Simple password management, secure sharing, offline access | Affordable and reliable | Limited advanced features | Casual users | Free and Premium versions |
| NordPass | Password health tools, zero-knowledge encryption, cross-device sync | User-friendly, good for beginners | Fewer advanced features | Users in the Nord ecosystem | Free and Premium versions |
| Zoho Vault | Password sharing, role-based access, Zoho app integration, MFA support | Affordable for teams | Less intuitive for individual users | Small businesses and Zoho users | Free personal plan; paid versions |
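For item 4 above, you don’t need to invent a 100-character password yourself; openssl can generate one. The length and encoding below are arbitrary illustration choices:

```shell
# Generate a 100-character random password: base64-encode random bytes,
# strip the line wraps, then trim to exactly 100 characters.
PASSWORD=$(openssl rand -base64 100 | tr -d '\n' | cut -c1-100)
echo "${#PASSWORD} characters"   # prints "100 characters"
```

Paste the result straight into your password manager; you never need to memorize or type it.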

Other useful knowledge found during research

MS O365 Enable Passkey Functionality

This was interesting from an administrative view. Originally there was only one type of passkey functionality for MS O365, and it was buried within MS Authenticator. Now it looks like there is native support as well.

To enable Passkey (FIDO2) within O365, we had to go to the admin console and enable four (4) switches.

Firefox Browser Debugging

It is impressive to see many parameters for Firefox to help isolate an issue.

Passkey versus SSH Key Table

I wanted to compare the similarities and the differences between passkey architecture and ssh keys (used for many years). This table summarizes the distinctions and overlaps between passkeys (a modern, browser-driven standard for web authentication) and SSH keys (a traditional tool for server authentication). This may help others to see how the evolution has progressed from behind the scenes with servers to public use with browsers.

Specs to review

https://w3c.github.io/webauthn

Expired Certs? Oh My!

How to Avoid the SSL Expiration Apocalypse

You’re minding your own business, sipping coffee, feeling invincible—when BAM! The website goes down. The boss storms in like an angry bear who just lost Wi-Fi. The culprit? An expired SSL certificate. Congratulations, you’ve just unlocked the IT version of public humiliation.

But fear not! The ssl-cert-expiration-date-check script is here to save you from a life of shame and awkward team meetings. Think of it as the superhero you never knew you needed, armed with OpenSSL and a knack for keeping your certificates alive and kicking.

What’s This?

SSL monitoring via bash shell. Here’s how it works:

  • It reads a list of FQDNs or IPs (fancy IT speak for “things you probably Googled how to find”) from a file named fqdn_list.txt.
  • It interrogates your endpoints like a bad cop in a detective movie, using OpenSSL binary to spill the beans on their SSL certificates.
  • It then writes the juicy details into certs_info.csv—because nothing says “I’m a professional” like a CSV file with MS Excel.

Boom. You now know when your certificates will expire. No more flying blind. No more angry bosses. No more soul-crushing outages.
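For the curious, the workflow above boils down to a loop along these lines. This is a minimal sketch, not the actual repo script; only the fqdn_list.txt and certs_info.csv names come from the article, and the rest is illustrative:

```shell
#!/usr/bin/env bash
# Minimal sketch: read "host port" pairs and record leaf-cert metadata.
set -u

# Sample input file: one "host port" pair per line (placeholder host).
printf 'example.com 443\n' > fqdn_list.txt

OUT=certs_info.csv
echo 'endpoint,port,subject,serial,expiration' > "$OUT"

while read -r FQDN PORT; do
  [ -z "${FQDN:-}" ] && continue   # skip blank lines
  # Grab the leaf certificate the endpoint presents (skip on failure).
  CERT=$(openssl s_client -connect "$FQDN:$PORT" -servername "$FQDN" \
           </dev/null 2>/dev/null | openssl x509 2>/dev/null) || continue
  SUBJECT=$(printf '%s\n' "$CERT" | openssl x509 -noout -subject | sed 's/^subject= *//')
  SERIAL=$(printf '%s\n' "$CERT" | openssl x509 -noout -serial  | cut -d= -f2)
  ENDDATE=$(printf '%s\n' "$CERT" | openssl x509 -noout -enddate | cut -d= -f2)
  printf '"%s","%s","%s","%s","%s"\n' "$FQDN" "$PORT" "$SUBJECT" "$SERIAL" "$ENDDATE" >> "$OUT"
done < fqdn_list.txt
```

The real script adds chain handling and more fields; this sketch only shows the skeleton of "read list, interrogate endpoint, append CSV row".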

Why Should You Care?

Let me paint you a picture: An expired SSL certificate means users see a terrifying “This site is NOT secure” warning. It’s basically the internet screaming, “Run away!” Your customers? Gone. Your reputation? Sinking faster than your confidence in this job.

But this script? It’s the anti-drama. It supports everything: HTTPS, LDAPS, JDBC/S, and even those obscure protocols no one dares to ask about. It logs every certificate—server, intermediate, maybe even root CA certificates if they’re feeling generous. It’s like an all-you-can-eat buffet of SSL info.

Did You Know?

  • Some certificates, like rebellious teenagers, don’t come pre-installed in your keystore. You’ve got to manually invite them to the party.
  • On Windows, this means opening certlm.msc (don’t worry, it’s not as scary as it sounds).
  • If you’re in Java Land, you’ll need to charm the keytool utility. It’s like convincing a cat to sit still—it’s tricky, but doable.

How to Become an SSL Wizard

  • Step 1: Open a text editor and create fqdn_list.txt. Add all your endpoints and ports, one per line. It’s like making a party guest list, but with less glitter.
  • Step 2: Run the script. Sit back. Look cool.
  • Step 3: Open the certs_info.csv file. Admire your work. Maybe print it out and frame it for the office wall.

The Moral of the Story

Neglect your SSL certificates, and the internet will publicly shame you. But with the ssl-cert-expiration-date-check script, you’ll avoid the chaos, the browser warnings, and the boss’s death stare.

So, download this script, save yourself, and become the hero your IT department deserves. Because nothing says “I’ve got this” like preventing a preventable disaster. Now, go forth and conquer the world of SSL certificates—preferably before your coffee gets cold.

The Location

https://github.com/anapartner-com/ssl-cert-expiration-date-check

View the README, play with the scripts, and provide feedback. Integrate this process with your SaaS monitoring solutions, e.g., Syslog (with Splunk), Broadcom DX O2 (APM), Grafana, Dynatrace, etc. Or use remote ssh to execute the process inside secured internal network segments, to query those certs as well.

The Magic of openssl binary

# Capture the server certificate plus any chain certs the endpoint presents.
openssl s_client -connect "$FQDN:$PORT" -showcerts 2>/dev/null > "temp_output.txt"


# Pull the metadata we need from each extracted certificate file.
EXPIRATION_DATE=$(openssl x509 -enddate -noout -in "$CERT_FILE" 2>/dev/null | cut -d= -f2)
SUBJECT_NAME=$(openssl x509 -subject -noout -in "$CERT_FILE" 2>/dev/null | sed 's/subject= //')
SERIAL_NUMBER=$(openssl x509 -serial -noout -in "$CERT_FILE" 2>/dev/null | cut -d= -f2)

We use openssl s_client to connect and return all available certs from the endpoint’s IP or FQDN and port, saving this information to a temporary file so we can review it for the metadata.

We obviously want “enddate” to find the expiration date of the certificates. But we also want both the “subject” (aka CN) and the “serial” of the certificates, to avoid the all-too-common challenge during certificate rotation where we have the SAME name (subject/CN) for the same root CA cert. This is a very annoying challenge and adds unnecessary effort during RCA (root-cause analysis) to identify the full certificate chain. The “serial” number helps us avoid this confusion.
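To get subject/serial per certificate, the chain captured by -showcerts has to be split into one file per certificate first. Here is a hedged sketch of that splitting step; the awk logic and the self-signed demo certs are illustrative, while temp_output.txt and CERT_FILE come from the snippets above:

```shell
set -e
workdir=$(mktemp -d) && cd "$workdir"

# Stand-in for real s_client output: two self-signed demo certs.
for i in 1 2; do
  openssl req -x509 -newkey rsa:2048 -keyout /dev/null -nodes \
    -subj "/CN=demo$i" -days 1 2>/dev/null
done > temp_output.txt

# Split the capture into cert_1.pem, cert_2.pem, ... one per certificate.
awk '/BEGIN CERTIFICATE/{n++; incert=1}
     incert{print > ("cert_" n ".pem")}
     /END CERTIFICATE/{incert=0}' temp_output.txt

# Report the three fields the article extracts, per certificate.
for CERT_FILE in cert_*.pem; do
  SUBJECT=$(openssl x509 -subject -noout -in "$CERT_FILE" | sed 's/^subject= *//')
  SERIAL=$(openssl x509 -serial -noout -in "$CERT_FILE" | cut -d= -f2)
  ENDDATE=$(openssl x509 -enddate -noout -in "$CERT_FILE" | cut -d= -f2)
  echo "$CERT_FILE | $SUBJECT | $SERIAL | $ENDDATE"
done
```

With subject AND serial on every row, two rotated certs that share a CN are still distinguishable in the CSV.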

Measuring What Matters: Metrics for Thriving Businesses

Metrics are essential for monitoring and optimizing the health of your solutions. The simplicity of using SaaS-based Application Performance Monitoring (APM)/Operational Intelligence/Analytics tools makes them indispensable for gaining actionable insights.

By leveraging metrics, you can not only ensure the performance and reliability of your systems but also build compelling ROI use cases. We can leverage these SaaS platforms to incorporate ROI queries to pull forward data that is not exposed in other dashboards.

In this guide, we’ll demonstrate the power of metrics by deploying a Broadcom DX O2 agent to the Symantec IGA Virtual Appliance in under 10 minutes, providing immediate value and visibility into your business operations. This straightforward process integrates seamlessly into your existing infrastructure, enhancing the observability and security of a hardened appliance.

This walk-through will showcase how metrics can enhance the observability and security of a hardened appliance.

Steps to leverage the DX O2 Java Agent:

Step01: Login to your instance of DX O2 (Operational Observability)

Step02: Download Agents

After you log in to your DX OI/O2 instance, navigate to the settings/agents section. When you select an agent, your custom authentication token is embedded in the package. We plan to use the javaagent offered for Wildfly (aka JBoss). Select this agent.

When the screen displays, expand the “Command Line Download”. We will use the wget command to download this agent directly to our Virtual Appliance, which has internet access. Otherwise, download the agent to your workstation and then transfer the file to the Virtual Appliance that has Wildfly running on it.

Step03: Login to the IGA Virtual Appliance with ssh.

Create a local media folder, then download the DX O2 agent. After the download succeeds, extract the agent into the folder the IGA Virtual Appliance uses for “java profilers”. Since the files are owned by the ‘config’ user, and the ‘wildfly’ user needs access to the log folders, chmod 777 both log folders to avoid startup issues for the Wildfly applications. You may leave the rest of the file/folder permissions as-is.

mkdir -p ~/media/dxoi_agent ; cd ~/media/dxoi_agent/

wget --content-disposition "https://apmgw.dxi-na1.saas.broadcom.com/acc/apm/acc/downloadpackage/XXXXXXXXXXXXXXXX?format=archive&layout=bootstrap_preferred&packageDownloadSecurityToken=ZZZZZZZZZZZZZZZZZZZZZZ"

ls -lart

tar -xf JBoss_jboss_20241117_v1.tar -C /opt/CA/VirtualAppliance/custom/profiler/

cd /opt/CA/VirtualAppliance/custom/profiler/

ls -lart

cd wily/

ls -lart

# We could update permissions for all files/folder to 777, but we only need the following to be changed.
# Update permission for folders that 'wildfly' will write out to.
chmod 777 logs/  ./releases/24.10/logs/
chmod 777 ./releases/24.10/core/config/hotdeploy
chmod 777 ./releases/24.10/extensions
mkdir -p ./releases/24.10/core/config/metrics
chmod 777 ./releases/24.10/core/config/metrics

After updating the permissions on the logs, hotdeploy, extensions, and metrics folders, run the shell script ./agent-location.sh. It outputs the JVM arguments we will use with the Wildfly instances for IdentityManager, IdentityPortal, and IdentityGovernance.

./agent-location.sh

/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/Agent.jar -Dcom.wily.introscope.agentProfile=/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/core/config/IntroscopeAgent.profile -Dintroscope.agent.bootstrap.home=/opt/CA/VirtualAppliance/custom/profiler/wily -Dintroscope.agent.bootstrap.release.version=24.10 -Dintroscope.agent.bootstrap.version.loaded=24.10

We will now edit the jvm-args.conf files for both IdentityManager and IdentityPortal with the string above. We prepend “-javaagent:” and, to avoid a Java LogManager module-loading-order error, we place the entire string at the very end of the JAVA_OPTS variable. We can use the same exact string and path for both, as the service name of each instance is determined automatically by the javaagent.

Below is a view of what the IM and IP jvm-args.conf files should look like. Please ensure the full string is at the very end.

cat /opt/CA/VirtualAppliance/custom/IdentityManager/jvm-args.conf

# Add the Broadcom DX-O2 Javaagent to Identity Manager
JAVA_OPTS=-Xms512m -Xmx4096m -XX:+UseG1GC -XX:+UseStringDeduplication -XX:+UseCompressedOops  -Djava.net.preferIPv4Stack=true   -XshowSettings:properties  -DLog4jContextSelector=org.apache.logging.log4j.core.selector.BasicContextSelector -javaagent:/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/Agent.jar -Dcom.wily.introscope.agentProfile=/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/core/config/IntroscopeAgent.profile -Dintroscope.agent.bootstrap.home=/opt/CA/VirtualAppliance/custom/profiler/wily -Dintroscope.agent.bootstrap.release.version=24.10 -Dintroscope.agent.bootstrap.version.loaded=24.10


cat /opt/CA/VirtualAppliance/custom/IdentityPortal/jvm-args.conf

# # Add the Broadcom DX-O2 Javaagent to Identity Portal
JAVA_OPTS=-Xms512m -Xmx1512m -XX:+UseG1GC -XX:+UseStringDeduplication -XX:+UseCompressedOops -Djava.net.preferIPv4Stack=true -javaagent:/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/Agent.jar -Dcom.wily.introscope.agentProfile=/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/core/config/IntroscopeAgent.profile -Dintroscope.agent.bootstrap.home=/opt/CA/VirtualAppliance/custom/profiler/wily -Dintroscope.agent.bootstrap.release.version=24.10 -Dintroscope.agent.bootstrap.version.loaded=24.10

Now stop and start both IdentityManager and IdentityPortal. We recommend using a second ssh session to monitor each wildfly-console.log, as it will immediately show any issues with the javaagent due to permissions or otherwise.

stop_im ; start_im

tail -F /opt/CA/wildfly-idm/standalone/log/wildfly-console.log


stop_ip ; start_ip

tail -F /opt/CA/wildfly-portal/standalone/log/wildfly-console.log

Step04: We are Done. View the DX O2 UI and review the new incoming data.

We recommend walking through all the pre-built dashboards available to monitor and alert on your solution. Of interest: IM shows up as the hostname of the Virtual Appliance, “vapp1453”, while IP shows up under the internal pseudo-name “IPnode1”. Note that these values can be overridden in the profile file.

A view of metrics by each agent. You must click on each sub-category to see what is being offered.

A very interesting view within the memory space of the IdentityManager application

Other views to review

What is very interesting is adding ROI metrics to dashboards, where we can monitor the number of events being utilized, e.g., external customer access or internal password changes. The APIs provide maximum flexibility to input any ROI metrics we wish directly.

Reach out and we will work with you to get the most value out of your solution.

Additional Notes

JVM Order Management

On the IGA virtual appliance, the order of JVM switches for “LogManager” is predetermined. If the new javaagent is not placed at the very end of JAVA_OPTS, we may see the generic warn/error messages below. We spent quite a bit of time being misled by these generic messages. We did not need to add extra JVM switches to manage the JVM order. If you do have challenges, review the current documentation for the JBoss agent.

WARNING: Failed to load the specified log manager class org.jboss.logmanager.LogManager

ERROR: WFLYCTL0013: Operation ("parallel-extension-add") failed - address: ([])
Caused by: java.util.concurrent.ExecutionException: java.lang.IllegalStateException: WFLYLOG0078: The logging subsystem requires the log manager to be org.jboss.logmanager.LogManager. The subsystem has not be initialized and cannot be used. To use JBoss Log Manager you must add the system property "java.util.logging.manager" and set it to "org.jboss.logmanager.LogManager"

FATAL: WFLYSRV0056: Server boot has failed in an unrecoverable manner; exiting. See previous messages for details.

Bonus Round: Deploy the Java Agent for the JCS component

While the javaagent for Wildfly and plain Java is the same, the support modules are slightly different. We might be able to combine them, but to avoid any possible concerns we separated the extraction folders. Add this javaagent string to the JCS JVM custom configuration file: jvm_options.conf

~/media/dxoi_agent > ls -lart

-rw-r--r-- 1 config config 30803968 Nov 16 21:18 JBoss_jboss_20241117_v1.tar


~/media/dxoi_agent > wget --content-disposition "https://apmgw.dxi-na1.saas.broadcom.com/acc/apm/acc/downloadpackage/7XXXXXXXXXX?format=archive&layout=bootstrap_preferred&packageDownloadSecurityToken=ZZZZZZZZZZZ"

~/media/dxoi_agent > ls -lart

-rw-r--r-- 1 config config 30803968 Nov 16 21:18 JBoss_jboss_20241117_v1.tar
-rw-r--r-- 1 config config 31645184 Nov 16 23:51 Java_other_20241117_v1.tar

~/media/dxoi_agent > tar -xf Java_other_20241117_v1.tar

~/media/dxoi_agent > ls -lart

-rw-r--r-- 1 config config 30803968 Nov 16 21:18 JBoss_jboss_20241117_v1.tar
-rw-r--r-- 1 config config 31645184 Nov 16 23:51 Java_other_20241117_v1.tar
drwxr-xr-x 4 config config      123 Nov 16 23:51 wily

~/media/dxoi_agent > mv wily/ /opt/CA/VirtualAppliance/custom/profiler/wily-jcs

~/media/dxoi_agent > cd /opt/CA/VirtualAppliance/custom/profiler/wily-jcs

/opt/CA/VirtualAppliance/custom/profiler/wily-jcs > ls -lart

drwxr-xr-x 2 config config     6 Nov 16 23:41 logs
-rw-r--r-- 1 config config     5 Nov 16 23:41 agent.release
-rwxr-xr-x 1 config config  1371 Nov 16 23:41 agent-location.sh
-rwxr-xr-x 1 config config  1138 Nov 16 23:41 agent-location.bat
-rw-r--r-- 1 config config 45258 Nov 16 23:41 Agent.jar
drwxr-xr-x 3 config config    19 Nov 16 23:51 releases

# We could update permissions for all files/folder to 777, but we only need the following to be changed.
# Update permission for folders that 'wildfly' will write out to.
chmod 777 logs/  ./releases/24.10/logs/
chmod 777 ./releases/24.10/core/config/hotdeploy
chmod 777  ./releases/24.10/extensions
mkdir -p ./releases/24.10/core/config/metrics
chmod 777 ./releases/24.10/core/config/metrics


/opt/CA/VirtualAppliance/custom/profiler/wily-jcs > ./agent-location.sh

/opt/CA/VirtualAppliance/custom/profiler/wily-jcs/releases/24.10/Agent.jar -Dcom.wily.introscope.agentProfile=/opt/CA/VirtualAppliance/custom/profiler/wily-jcs/releases/24.10/core/config/IntroscopeAgent.profile -Dintroscope.agent.bootstrap.home=/opt/CA/VirtualAppliance/custom/profiler/wily-jcs -Dintroscope.agent.bootstrap.release.version=24.10 -Dintroscope.agent.bootstrap.version.loaded=24.10
cat /opt/CA/IdentityManager/ConnectorServer/data/jvm_options.conf

-server -Xms128M -Xmx1024M -Djava.awt.headless=true -Dcom.sun.management.jmxremote -javaagent:/opt/CA/VirtualAppliance/custom/profiler/wily-jcs/releases/24.10/Agent.jar -Dcom.wily.introscope.agentProfile=/opt/CA/VirtualAppliance/custom/profiler/wily-jcs/releases/24.10/core/config/IntroscopeAgent.profile -Dintroscope.agent.bootstrap.home=/opt/CA/VirtualAppliance/custom/profiler/wily-jcs -Dintroscope.agent.bootstrap.release.version=24.10 -Dintroscope.agent.bootstrap.version.loaded=24.10

Below is a view of the JCS agent in the DX O2 UI. We see it sits by itself under “Java”. Also note a challenge with two (2) Wildfly (JBoss) instances using the same profile with the default “agentName=JBoss Agent”. These Wildfly instances were automatically named upon startup, but after a while the static name in the profile took precedence. See more information below.

Challenge with default naming convention

When we have two (2) or more applications using the same profile, DX O2 may attempt to join them together in the metrics UI. To avoid this, let’s make two (2) copies of the IntroscopeAgent.profile and add our own “agentName” to each. Do NOT forget to comment out the default “introscope.agent.agentName=JBoss Agent”. We added “com.wily.introscope.agent.agentName” as well, since it is called out in the online documentation.

Observations: the IP deployment honors the new value immediately. The IM deployment claims in the DX O2 agent logs that “Unable to automatically determine the Agent Name because: The Application Server naming mechanism is not yet available.” It defaults to the hostname of the Virtual Appliance for a few minutes, then appears to reset to the correct agentName later.

/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/core/config > head IntroscopeAgent.ip.profile
###############################################################################
# Add name for IP application
###############################################################################
com.wily.introscope.agent.agentName=IP
introscope.agent.agentName=IP

/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/core/config > head IntroscopeAgent.im.profile
###############################################################################
# Add name for IM application
###############################################################################
com.wily.introscope.agent.agentName=IM
introscope.agent.agentName=IM
cat /opt/CA/VirtualAppliance/custom/IdentityManager/jvm-args.conf

# Add the Symantec/Broadcom DX-OI/2 Javaagent to Identity Manager
JAVA_OPTS=-Xms512m -Xmx4096m -XX:+UseG1GC -XX:+UseStringDeduplication -XX:+UseCompressedOops  -Djava.net.preferIPv4Stack=true   -XshowSettings:properties  -DLog4jContextSelector=org.apache.logging.log4j.core.selector.BasicContextSelector -javaagent:/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/Agent.jar -Dcom.wily.introscope.agentProfile=/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/core/config/IntroscopeAgent.im.profile  -Dintroscope.agent.bootstrap.home=/opt/CA/VirtualAppliance/custom/profiler/wily -Dintroscope.agent.bootstrap.release.version=24.10 -Dintroscope.agent.bootstrap.version.loaded=24.10


cat /opt/CA/VirtualAppliance/custom/IdentityPortal/jvm-args.conf

# # Add the Symantec/Broadcom DX-OI/2 Javaagent to Identity Portal
JAVA_OPTS=-Xms512m -Xmx1512m -XX:+UseG1GC -XX:+UseStringDeduplication -XX:+UseCompressedOops -Djava.net.preferIPv4Stack=true -javaagent:/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/Agent.jar -Dcom.wily.introscope.agentProfile=/opt/CA/VirtualAppliance/custom/profiler/wily/releases/24.10/core/config/IntroscopeAgent.ip.profile -Dintroscope.agent.bootstrap.home=/opt/CA/VirtualAppliance/custom/profiler/wily -Dintroscope.agent.bootstrap.release.version=24.10 -Dintroscope.agent.bootstrap.version.loaded=24.10

Per the online documentation, we have other renaming options as well:

# Using JVM -D environmental switches, we can set one of these two
# -D JVM switches
# -DagentName=IM
# -Dcom.wily.introscope.agent.agentName=IM


# Within the IntroscopeAgent.profile configuration file, we have these options.
# Allow Introscope to pick up a JVM environment value from a pre-existing -D variable
introscope.agent.agentNameSystemPropertyKey=jboss.node.name

# Static name for the agent
introscope.agent.agentName=IP

# Allow the Introscope agent to append an integer to the name, used for clusters, e.g. JBoss Agent-1, JBoss Agent-2
introscope.agent.clonedAgent=true
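If the agentNameSystemPropertyKey route is chosen, the JVM must actually define the property it points at. A minimal sketch of extending the JAVA_OPTS line in jvm-args.conf (the value ‘IM-node1’ is illustrative, not an appliance default):

```shell
# Sketch: define the property that agentNameSystemPropertyKey points at.
# 'IM-node1' is an illustrative node name, not from the appliance defaults.
JAVA_OPTS="$JAVA_OPTS -Djboss.node.name=IM-node1"

echo "$JAVA_OPTS"
```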

The deployments for IP and JCS had no issues using any of the above; the IM application only responded well to the clonedAgent example. This is likely due to how the LogManager modules are ordered in the startup files, which the ‘config’ service ID does not have write access to modify.

Log Folder Cleanup

The ‘config’ service ID owns the log folder, and even though some of the files in it are owned by ‘wildfly’, the ‘config’ user can still delete them.
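As a sketch of such a cleanup, run as the ‘config’ user: the 14-day retention window is an assumption, and the demo below uses a temporary directory standing in for the real log folder (e.g. /opt/CA/wildfly-idm/standalone/log) so it can run anywhere:

```shell
# Demo directory standing in for the appliance log folder,
# e.g. /opt/CA/wildfly-idm/standalone/log (path is an assumption)
LOG_DIR=$(mktemp -d)

# Create a rotated log file and age it to 15 days old
touch "$LOG_DIR/server.log.2024-11-01"
touch -d '15 days ago' "$LOG_DIR/server.log.2024-11-01"

# Delete rotated logs older than a 14-day retention window
find "$LOG_DIR" -type f -name '*.log.*' -mtime +14 -delete

ls "$LOG_DIR"   # the aged file is gone
```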

ROI Metrics

Enable the API section of your DX O2 instance to create your ROI Metrics Input

Access the Swagger-UI via the directions in the DX O2 documentation.

Again, please reach out and we will work with you to get the most value out of your solution.

Did I Run That? A Bash History Adventure


On large project teams, multiple members may often use the same hosts simultaneously. Alternatively, you might prefer to maintain multiple SSH sessions open on the same host—one for monitoring logs and another for executing commands. While a Linux host using the Bash shell records command-line history, the default settings can pose challenges. Specifically, they may result in the loss of prior history when multiple sessions access the same host.

To address this, you can make some enhancements to your configuration. On the Symantec IGA Virtual Appliance, we typically add these improvements to the .bashrc files of the config, dsa, and imps service IDs. These adjustments ensure the preservation of command history for all work performed. Naturally, it is also important to clean up or remove any sensitive data, such as passwords, from the history.

Below, we explore an optimized .bashrc configuration that focuses on improving command history management. Key features include appending history across sessions, adding timestamps to commands, ignoring specific commands, and safeguarding sensitive inputs.

Optimized .bashrc Configuration

Here’s the full configuration we’ll be exploring:

# Added to improve history of all commands
shopt -s histappend
export HISTTIMEFORMAT='%F %T '
export HISTSIZE=10000
export HISTFILESIZE=100000
export HISTIGNORE='ls:history'
export HISTCONTROL=ignorespace
export PROMPT_COMMAND='history -a; history -c; history -r'

Detailed Explanation of the Configuration

shopt -s histappend

Ensures that new commands from the current session are appended to your history file instead of overwriting it. This prevents accidental history loss across sessions.

export HISTTIMEFORMAT='%F %T '

Adds a timestamp to each command in your history, formatted as YYYY-MM-DD HH:MM:SS.

export HISTSIZE=10000

Limits the number of commands retained in memory during the current session to 10,000.

export HISTFILESIZE=100000

Configures the maximum number of commands saved in the history file to 100,000.

export HISTIGNORE='ls:history'

Excludes frequently used or less important commands like ls and history from being saved, reducing clutter.

export HISTCONTROL=ignorespace

Prevents commands that start with a space from being saved to history. This is particularly useful for sensitive commands, such as those containing passwords or API keys. When copying and pasting such a command from Notepad++ or a similar editor, remember to put a space character in front of it.

export PROMPT_COMMAND='history -a; history -c; history -r'

Keeps history synchronized across multiple shell sessions: history -a appends new commands to the history file, history -c clears the in-memory history for the current session, and history -r reloads history from the history file.
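These settings can be exercised non-interactively as a quick sanity check. The sketch below enables the history machinery inside a plain script (it is off by default in scripts), records an entry with `history -s` (which adds a line without executing it), and shows the timestamped output format:

```shell
# Enable history recording (scripts have it off by default)
set -o history
export HISTTIMEFORMAT='%F %T '

# Record an entry without executing it
history -s 'systemctl status wildfly'

# List entries; each carries a YYYY-MM-DD HH:MM:SS timestamp
history
```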

Symantec IGA Virtual Appliance Service IDs

Each service ID manages its shell environment with .profile or .bash_profile and .bashrc file(s).

We can see that the default .bash_profile for the ‘config’ service ID already has a reference that sources .bashrc:

config@vapp1453 VAPP-14.5.0 (192.168.2.45):~ > cat .bash_profile
# .bash_profile

# Get the aliases and functions
if [ -f ~/.bashrc ]; then
        . ~/.bashrc
fi

# User specific environment and startup programs

config@vapp1453 VAPP-14.5.0 (192.168.2.45):~ > cat .bashrc
# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
        . /etc/bashrc
fi

# User specific environment
if ! [[ "$PATH" =~ "$HOME/.local/bin:$HOME/bin:" ]]
then
    PATH="$HOME/.local/bin:$HOME/bin:$PATH"
fi
export PATH

# Uncomment the following line if you don't like systemctl's auto-paging feature:
# export SYSTEMD_PAGER=

# User specific aliases and functions
if [ -d ~/.bashrc.d ]; then
        for rc in ~/.bashrc.d/*; do
                if [ -f "$rc" ]; then
                        . "$rc"
                fi
        done
fi

unset rc

# Added to improve history of all commands
shopt -s histappend
export HISTTIMEFORMAT='%F %T '
export HISTSIZE=10000
export HISTFILESIZE=100000
export HISTIGNORE='ls:history'
export HISTCONTROL=ignorespace
export PROMPT_COMMAND='history -a; history -c; history -r'

A view of the ‘dsa’ service ID files with some modifications. The default .profile has only the one line that sources the file /opt/CA/Directory/dxserver/install/.dxprofile. To assist with monitoring history, instead of other direct updates, we add a reference that sources a new .bashrc file.

[dsa@vapp1453 ~]$ cat .profile
. /opt/CA/Directory/dxserver/install/.dxprofile

# Get the aliases and functions
if [ -f ~/.bashrc ]; then
        . ~/.bashrc
fi

Below is the view of the new .bashrc file to be sourced by the DSA .profile file.

[dsa@vapp1453 ~]$ cat .bashrc

# Added to improve history of all commands
shopt -s histappend
export HISTTIMEFORMAT='%F %T '
export HISTSIZE=10000
export HISTFILESIZE=100000
export HISTIGNORE='ls:history'
export HISTCONTROL=ignorespace
export PROMPT_COMMAND='history -a; history -c; history -r'

A view of the ‘imps’ service ID files with some modifications. The default .profile has only the one line that sources the file /etc/.profile_imps. To assist with monitoring history, instead of other direct updates, we add a reference that sources a new .bashrc file.

imps@vapp1453 VAPP-14.5.0 (192.168.2.45):~ > cat .profile
# Source IM Provisioning Profile script
. /etc/.profile_imps

# Get the aliases and functions
if [ -f ~/.bashrc ]; then
        . ~/.bashrc
fi

Below is the view of the new .bashrc file to be sourced by the IMPS .profile file.

imps@vapp1453 VAPP-14.5.0 (192.168.2.45):~ > cat .bashrc

# Added to improve history of all commands
shopt -s histappend
export HISTTIMEFORMAT='%F %T '
export HISTSIZE=10000
export HISTFILESIZE=100000
export HISTIGNORE='ls:history'
export HISTCONTROL=ignorespace
export PROMPT_COMMAND='history -a; history -c; history -r'

Delete Sensitive Information from History

If sensitive information has already been recorded in your history, you should clean it up. While you could wipe the entire history, a better approach is to retain as much as possible and remove only the sensitive entries.

The Challenge of Deleting Sensitive History

When deleting specific entries from Bash history, there’s a complication: line numbers change dynamically. The Bash history is a sequential list, so removing an entry causes all subsequent commands to shift up, altering their line numbers.

To address this, the cleanup process should iterate backward through the history. Starting with the last match ensures that earlier line numbers remain unaffected by changes further down the list.
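The principle can be demonstrated in isolation with a toy bash array standing in for the history list: deleting the highest-numbered match first leaves the lower numbers valid when we reach them.

```shell
# Toy model of the history list: delete entries 2 and 4 (1-based).
# Working from the highest number down, the lower number is still
# correct when we reach it.
items=(alpha beta gamma delta epsilon)
for idx in 4 2; do
    unset 'items[idx-1]'    # bash evaluates the arithmetic in the subscript
done
items=("${items[@]}")       # re-pack the remaining entries
echo "${items[*]}"          # -> alpha gamma epsilon
```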

Cleanup Script

Save the following script as history_cleanup.sh; when run, it prompts for the pattern matching the sensitive commands you want to delete:

#!/bin/bash
##################################################################
#  Name: history_cleanup.sh
#  Goal: Provide a means to clean up prior bash history of any
#  sensitive data by a known pattern, e.g. password or token
#
# ANA 11/2024
##################################################################
# Load the saved history into this shell; a script starts with an
# empty history list, so enable history and read the file first
HISTFILE="${HISTFILE:-$HOME/.bash_history}"
set -o history
history -r

# Prompt the user to enter the pattern to search for
read -p "Enter the pattern to search for in history: " PATTERN

# Validate input
if [ -z "$PATTERN" ]; then
    echo "No pattern entered. Exiting."
    exit 1
fi

# Find matching history entries and delete them in reverse order.
# 'tac' reverses the matches so line numbers stay valid as we delete,
# and process substitution (not a pipe) keeps the loop in the current
# shell so 'history -d' affects the history we save below.
while read -r line; do
    # Extract the history line number (first column in the output)
    LINE_NUMBER=$(echo "$line" | awk '{print $1}')

    # Delete the history entry by its line number
    history -d "$LINE_NUMBER"
done < <(history | grep "$PATTERN" | tac)

# Save the updated history back to the history file
history -w

echo "History cleanup complete. Entries matching '$PATTERN' have been removed."

Final Thoughts

Applying this .bashrc configuration across all service IDs offers several advantages. It streamlines workflows, secures sensitive inputs, and ensures a more organized command history. These enhancements are particularly valuable for developers, administrators, or anyone operating in multi-terminal environments.

Key Benefits:

  • History Persistence: Ensures commands are appended to the history file without overwriting existing entries, preserving a complete record of activity.
  • Enhanced Auditability: Adds timestamps to history, making it easier to track when specific commands were executed.
  • Reduced Noise: Excludes less critical commands, such as ls, to keep the history clean and focused on meaningful actions.
  • Improved Privacy: Commands starting with a space are omitted from the history, protecting sensitive inputs like passwords or API keys.
  • Real-Time Synchronization: Maintains consistent history across multiple terminal sessions, enabling seamless transitions and collaboration.

By adopting these configurations, you can enhance productivity, improve security, and achieve better management of command history in your environment.

Unleashing Snapshot Reload Magic in WildFly on a Secured Virtual Appliance


On a typical Linux host, rolling back a configuration in WildFly can be as simple as copying a backup of the configuration XML file back into place. However, working within the constraints of a secured virtual appliance (vApp) presents a unique challenge: the primary service ID often lacks write access to critical files under the WildFly deployment.

When faced with this limitation, administrators may feel stuck. What options do we have? Thankfully, WildFly’s jboss-cli.sh process provides a lifeline for configuration management, allowing us to take snapshots and reload configurations efficiently. See the bottom of this blog if you need to create a user for jboss-cli.sh usage.

Why Snapshots Are Necessary for Your Sanity

WildFly snapshots capture the server’s current configuration, creating a safety net for experimentation and troubleshooting. They allow you to test changes, debug issues, or introduce new features with confidence, knowing you can quickly restore the server to a previous state.

In this guide, we’ll explore a step-by-step process to test and restore configurations using WildFly snapshots on the Symantec IGA Virtual Appliance.

Step-by-Step: Testing and Restoring Configurations

Step 1: Stamp and Backup the Current Configuration

First, you may optionally add a unique custom attribute to the current `standalone.xml` (ca-standalone-full-ha.xml) configuration if you don’t already have a delta to compare. This custom attribute acts as a marker, helping you track configuration changes. After updating the configuration, take a snapshot.

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command="/system-property=custom.config.version:remove()"

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command="/system-property=custom.config.version:add(value='v1.0.20241114-Alan-was-here')"

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command="/system-property=custom.config.version:read-resource"

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command=":take-snapshot"
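Since every step repeats the same connection flags, a small wrapper function keeps the commands readable; the `jcli` name is an assumption, while the path and credentials match this guide’s examples:

```shell
# Sketch: wrapper for the repeated jboss-cli.sh invocation.
# 'jcli' is a hypothetical helper name; path/credentials are
# the same ones used throughout this guide.
jcli() {
    /opt/CA/wildfly-idm/bin/jboss-cli.sh --connect \
        --user=jboss-admin --password='Password01!' \
        --timeout=90000 --command="$1"
}

# Usage (requires a running WildFly instance):
# jcli ":take-snapshot"
# jcli "/system-property=custom.config.version:read-resource"
```

Per the earlier history discussion, remember that flags like --password land in shell history; prefix such commands with a space when HISTCONTROL=ignorespace is set.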

Step 2: Modify the Configuration for Testing

Simulate a change by updating the custom attribute. Validate the update with a read query to confirm the changes are applied. To be safe, we will remove the attribute and re-add with a new string that is different.

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command="/system-property=custom.config.version:remove()"

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command="/system-property=custom.config.version:add(value='v1.0.20241114-Alan-was-here_v2')"

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command="/system-property=custom.config.version:read-resource"

Step 3: Review Available Snapshots

List all available snapshots to identify the correct rollback point.
You can use the `:list-snapshots` command to query snapshots and verify files in the snapshot directory.

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command=":list-snapshots"

ls -l /opt/CA/wildfly-idm/standalone/configuration/standalone_xml_history/snapshot/

Step 4: Reload from Snapshot

Once you’ve identified the appropriate snapshot, use the `:reload` command to roll back the configuration.
Monitor the process to ensure it completes successfully, then verify the configuration.

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command=":reload(server-config=/standalone_xml_history/snapshot/20241114-232053024ca-standalone-full-ha.xml)"

tail -F /opt/CA/wildfly-idm/standalone/log/wildfly-console.log

/opt/CA/wildfly-idm/bin/jboss-cli.sh --connect --user=jboss-admin --password=Password01! --timeout=90000 --command="/system-property=custom.config.version:read-resource"

Adding a WildFly Admin User for Snapshot Management

Before you can execute commands through WildFly’s `jboss-cli.sh`, you’ll need to ensure you have a properly configured admin user.
If an admin user does not already exist, you can create one with the following command:

sudo /opt/CA/wildfly-idm/bin/add-user.sh -m -u jboss-admin -p Password01! -g SuperUser


  • -m: Indicates the user is for management purposes.
  • -u jboss-admin: Specifies the username (jboss-admin in this case).
  • -p Password01!: Sets the password for the user.
  • -g SuperUser: Assigns the user to the SuperUser group, granting the necessary permissions for snapshot and configuration management.

You can have as many jboss-cli.sh service IDs as you need.

Please note that this WildFly management service ID is not the same as the WildFly application service ID needed for /iam/im/logging_v2.jsp access, which requires the -a switch and the IAMAdmin group:

sudo /opt/CA/wildfly-idm/bin/add-user.sh -a -u jboss-admin -p Password01! -g  IAMAdmin -r ApplicationRealm

If your logging_v2.jsp page is not displaying correctly, there is a simple update to resolve this challenge. Add the string below to your /opt/CA/VirtualAppliance/custom/IdentityManager/jvm-args.conf file.

-DLog4jContextSelector=org.apache.logging.log4j.core.selector.BasicContextSelector

Consider the above as good practice before any major update or upgrade. We can work with you to manage your environment.