A completed all-white plastic model does not do justice to the details of these plastic models.
Author Archives: einsamsoldat
Vagrant to orchestrate Ubuntu in VirtualBox, installing boto3 and Ansible
At the time of this post, the compatibility matrix of Vagrant and VirtualBox is as follows:
Vagrant version | VirtualBox version
2.3.7 | 7.0.10, 7.0.12
Unfortunately, Vagrant 2.4.0 does not work well with VirtualBox 7.0.
This post was created using Vagrant 2.3.7 and VirtualBox 7.0.10.
To get Vagrant working, after installing Vagrant from the HashiCorp webpage, a Vagrantfile needs to be created. The most basic setup Vagrant needs to work is a folder containing a Vagrantfile.
Additional post-startup scripts used in this example to complete the installations are setup.sh, install_ansible.sh, install_boto3.sh and install_python3.sh.
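Putting the two pieces together, a minimal Vagrantfile that wires up the post-startup scripts named above could look like this. This is a sketch: the box name "ubuntu/jammy64" is an assumption, so substitute whichever Ubuntu box you actually use.

```shell
# Write a minimal Vagrantfile into the current folder.
# Box name "ubuntu/jammy64" is an assumption -- pick your preferred Ubuntu box.
cat > Vagrantfile <<'EOF'
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/jammy64"
  # Run the post-startup provisioning scripts in order
  config.vm.provision "shell", path: "setup.sh"
  config.vm.provision "shell", path: "install_python3.sh"
  config.vm.provision "shell", path: "install_boto3.sh"
  config.vm.provision "shell", path: "install_ansible.sh"
end
EOF
echo "Vagrantfile written"
```

With the four scripts placed next to the Vagrantfile, `vagrant up` creates the VirtualBox VM and runs each provisioner once.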
Setting up OpenVPN using Amazon Lightsail
Pre-requisites:
1. Download “openvpn-install.sh” from GitHub – angristan/openvpn-install: Set up your own OpenVPN server on Debian, Ubuntu, Fedora, CentOS or Arch Linux.
2. Have Amazon Lightsail activated with quota in your AWS account.
* Credits to cyberciti for the instructions and the script's introduction: Ubuntu 20.04 LTS Set Up OpenVPN Server In 5 Minutes – nixCraft (cyberciti.biz)
3. An SSH keypair is created and added to your AWS account.
4. Make sure an OpenVPN client is installed on your computer: OpenVPN Connect – Client Software For Windows | OpenVPN
Steps:
Creating a Ubuntu AWS instance
1. Click on “Create Instance”
2. Select OS only, and choose your favorite Linux distro; in my case, Ubuntu was chosen.
At the time this blog post was created, the OpenVPN install script works on both Ubuntu 22.04 and 20.04.
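The steps to run on the new Ubuntu instance over SSH can be sketched as follows. The commands are written into a small helper script first so they can be reviewed before running; the raw URL points at the angristan/openvpn-install repository referenced in the prerequisites.

```shell
# Save the server-side steps into a reviewable helper script.
cat > setup-openvpn.sh <<'EOF'
#!/bin/bash
set -e
# Fetch the install script from the angristan/openvpn-install GitHub repo
wget -O openvpn-install.sh https://raw.githubusercontent.com/angristan/openvpn-install/master/openvpn-install.sh
chmod +x openvpn-install.sh
# Interactive: answer the prompts, then download the generated .ovpn client file
sudo ./openvpn-install.sh
EOF
chmod +x setup-openvpn.sh
bash -n setup-openvpn.sh && echo "setup-openvpn.sh syntax OK"
```

On the Lightsail instance, running `./setup-openvpn.sh` walks through the interactive installer; the resulting client profile is what you import into OpenVPN Connect on your computer.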
Setting up AWS to use Amazon Lightsail
Requirements:
- Reside in a country that is not sanctioned
- Credit card
- Enabling and Setting up MFA
Problem statement:
The misadventures started when attempting to migrate this blog from a web host to AWS. Amazon Lightsail was used as it was an easier option and it is friendly to the wallet.
A request for AWS to increase the quota for my AWS account was started on 31st July 2023. After logging into the dashboard of the AWS web console, click on your user name on the upper right, then click Service Quotas.
Application beyond the HP way of management by walking around
HP's management by walking around is a still-relevant concept for front-line, middle and higher management to understand the daily operations of the company.
It provides an opportunity for employers and employees to engage better in the workplace.
An unspoken sense of the company's morale and trust level can be observed in how willing employees are to decorate their cubes to feel more at home.
For the initiated, such vibrant and diverse decoration allows personnel in the organization to understand one another at a more personal level. This has the potential to build and promote trust amongst colleagues at different layers of the organization.
Seasoned corporate citizens could use cues from decorations as an ice breaker, or to find people in the organization who may share common ground with you.
One must be cautioned, though: given the heightened sense of privacy and the need for personal space, do use wisdom when applying such methods.
A diverse or vibrant way of decorating one's cube is in direct contradiction to practitioners of Six Sigma, though.
The best wine that I have ever tasted yet
Kiyomai is its brand name; the wine type is Tokachi wine. It was brought home by my parents and sister after their trip to Hokkaido, Japan.
Bitnami: installing Let's Encrypt SSL for a custom fancy domain
Based on the link https://docs.bitnami.com/aws/how-to/generate-install-lets-encrypt-ssl/
For users with a custom domain similar to my domain, chow.karmeng.my, it is important to change the --tls switch in the lego command to --http.
sudo /opt/bitnami/letsencrypt/lego --http --email="youremail@yourdomain.com" --domains="domain.com" --domains="fancy.domain.com" --path="/opt/bitnami/letsencrypt" run
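Once the certificate is issued, the same linked Bitnami documentation covers renewal with the lego `renew` subcommand. The sketch below applies the same `--http` swap for custom domains like mine; the `--days 90` value is illustrative, so verify both against the linked doc before relying on it.

```
sudo /opt/bitnami/letsencrypt/lego --http --email="youremail@yourdomain.com" --domains="domain.com" --domains="fancy.domain.com" --path="/opt/bitnami/letsencrypt" renew --days 90
```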
Working with AWS EC2 burstable Instances
Technically inclined folks may choose to use the “free” tier for the first 2 months of AWS Lightsail. Free comes at the cost of burstable CPU consumption. Lightsail users need to get familiar with the Average CPU Utilization and Remaining CPU burst capacity charts shown in the screenshot below.
Once the AWS EC2 burstable instance starts, AWS adds a small amount of burst credit as long as the EC2 instance's CPU usage does not exceed 10%. If deploying software onto the instance pushes CPU usage above 10%, the CPU burst capacity is deducted. When the remaining CPU burst capacity reaches 0%, the performance of the EC2 instance is capped.
For the CPU burst capacity to build to 100%, the EC2 instance's CPU utilization must not exceed 10%. This allows the CPU burst capacity to accumulate to 100% capacity over a period of 27 hours.
Therefore, developers and system architects need to understand this feature of EC2 burstable CPUs and allow ample time for the CPU burst capacity to build up. In the real world, when deploying applications, the software provisioning may use up all the CPU burst capacity and cause a bad user experience of software performance in the AWS cloud. If I were an enterprise solutions architect, I would not use EC2 instances with burstable CPU capacity.
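A quick back-of-the-envelope check of the refill figures above: assuming capacity refills roughly linearly while CPU stays under the 10% baseline (an assumption for illustration; the exact earn rates vary per instance size, so check the AWS docs for your instance type), the implied accrual rate is:

```shell
full_capacity=100   # percent of burst capacity
refill_hours=27     # refill time quoted above
awk -v c="$full_capacity" -v h="$refill_hours" \
    'BEGIN { printf "Accrual rate: %.1f capacity points per idle hour\n", c/h }'
```

So an instance gains only a few points of burst capacity per idle hour, which is why a heavy deployment can leave the instance capped for a day or more.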
Hello AWS Hosted Site!
Welcome to AWS hosted site.
A lot of opportunities to re-discover hands-on AWS after many years of inactivity.
EC2 and Route 53 are all that's needed, thanks to AWS Lightsail.
To prevent system out-of-memory errors when calculating Azure Blob Storage usage using CalculateBlobCost.ps1
A month ago, I was working on migrating a system from Azure to another cloud platform. The Azure web management portal, at the time of this post, has no form of UI to identify the amount of space used by your Blob Storage. One can only know how much one owes Azure from the Billing portal.
Fortunately, the Microsoft Azure team has come out with a simple PowerShell script that provides better insight into how many blob objects there are and how much space is used by your Azure Blob Storage. The solution was provided at Get Billable Size for Azure Blob on MSDN.
The script provided worked fine, until it stopped due to a System.OutOfMemoryException:
PS C:\dev\powershell> .\CalculateBlobCost.ps1 -storageaccountname "myProdatAmerNorth"
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Automation\Microsoft.Azure.Commands.Automation.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\TrafficManager\Microsoft.WindowsAzure.Commands.TrafficManager.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Profile.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Compute\Microsoft.WindowsAzure.Commands.ServiceManagement.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Storage\Microsoft.WindowsAzure.Commands.Storage.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\ManagedCache\Microsoft.Azure.Commands.ManagedCache.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\HDInsight\Microsoft.WindowsAzure.Commands.HDInsight.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Networking\Microsoft.WindowsAzure.Commands.ServiceManagement.Network.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\StorSimple\Microsoft.WindowsAzure.Commands.StorSimple.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RemoteApp\Microsoft.WindowsAzure.Commands.RemoteApp.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RecoveryServices\Microsoft.Azure.Commands.RecoveryServices.dll'.
VERBOSE: 11:35:12 AM - Begin Operation: Get-AzureStorageAccount
VERBOSE: 11:35:14 AM - Completed Operation: Get-AzureStorageAccount
WARNING: GeoReplicationEnabled property will be deprecated in a future release of Azure PowerShell. The value will be merged into the AccountType property.
VERBOSE: 11:35:14 AM - Begin Operation: Get-AzureStorageKey
VERBOSE: 11:35:16 AM - Completed Operation: Get-AzureStorageKey
VERBOSE: Container 'mySettings' with 5 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+         $Blob.ICloudBlob.DownloadBlockList() |
+         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException
Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+         $Blob.ICloudBlob.DownloadBlockList() |
+         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : StorageException
Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+         $Blob.ICloudBlob.DownloadBlockList() |
+         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : StorageException
Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+         $Blob.ICloudBlob.DownloadBlockList() |
+         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : StorageException
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:124 char:13
+             $containerSizeInBytes += Get-BlobBytes $_
+             ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException
ForEach-Object : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:123 char:9
+         ForEach-Object {
+         ~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [ForEach-Object], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.ForEachObjectCommand
VERBOSE: Container 'whoamI-version-2012' with 79895 blobs has a size of 33.56MB.
VERBOSE: Container 'whoamI-version-2' with 0 blobs has a size of 0.00MB.
VERBOSE: Container 'whoamI-version-3' with 0 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+         $Blob.ICloudBlob.DownloadBlockList() |
+         ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException
VERBOSE: Container 'whoamI-version-2' with 10070 blobs has a size of 7.29MB.
VERBOSE: Container 'whoamI-version-3' with 423186 blobs has a size of 1337.75MB.
VERBOSE: Container 'whoamI-version-4' with 95 blobs has a size of 95.96MB.
A few hours later, I identified that one of my Azure Blob containers, “whoamI-version-3”, which has more than 1 million objects with a size exceeding 1GB, was the culprit that caused the script to fail. The cmdlet that failed was Get-AzureStorageBlob.
From MSDN, Get-AzureStorageBlob has the parameter -MaxCount, which makes it possible to count the objects in the Azure Blob container in batches, similar to SQL paging.
By adding the MaxCount parameter and the logic for a continuation token, the improved script is now able to run without any fear of an OutOfMemoryException.
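The batching pattern follows the MSDN example for Get-AzureStorageBlob with -MaxCount and -ContinuationToken. Roughly, the sketch below replaces a single all-at-once listing (the batch size of 10000 and the variable names $containerName/$storageContext are illustrative, not the exact ones from the attached script):

```powershell
$MaxReturn = 10000   # illustrative batch size
$Token = $null
do {
    # Fetch at most $MaxReturn blobs per call, resuming from the last token
    $Blobs = Get-AzureStorageBlob -Container $containerName -Context $storageContext `
                 -MaxCount $MaxReturn -ContinuationToken $Token
    if ($Blobs -ne $null -and $Blobs.Count -gt 0) {
        # ... accumulate blob counts and sizes for this batch here ...
        $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
    } else {
        $Token = $null
    }
} while ($Token -ne $null)
```

Because each loop iteration only holds one batch of blob metadata in memory, even a container with hundreds of thousands of blobs no longer triggers the OutOfMemoryException.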
To use the improved PowerShell script immediately, kindly download the file attached to this post: CalculateBlobCost PowerShell script.