Making the Windows RDP host on your home network accessible from the internet

It is simple to make a computer on your home LAN accessible from the internet.

The prerequisite is understanding how Network Address Translation (NAT) works. Because the public IP your ISP provides to your home connection is usually dynamic, it is worth researching Dynamic DNS providers; a DDNS hostname is easier than repeatedly checking websites like whatismyip.com. Then you just need to fire up the RDP client and use the Dynamic DNS FQDN, which will always point at your current public IP.
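
For example, assuming a hypothetical DDNS hostname home.example.com and the default RDP port, the Windows RDP client can be launched straight from a command prompt:

mstsc /v:home.example.com:3389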

Depending on the network equipment used in your home or provided by your Internet Service Provider (ISP), the setting may be labelled NAT, port forwarding or, in my case, IPv4 port mapping.

The screenshot above shows how to create a NAT mapping (IPv4 port mapping) for a computer with the LAN IP 192.168.100.12. Take note that RDP uses port 3389 over TCP; hence, the internal port must be 3389.

The external port is set to 3389 purely for the sake of simplicity.

If more hosts on your LAN need to be reachable from the internet over RDP, there is no need to change the default RDP port on each host.

Instead, pick an external port that is not blocked by your ISP. For example, assuming higher port numbers are not blocked, you could use 13389, 23389, 33389, 43389, 53389 or 63389, as long as the number stays between 1024 and 65535. Use one of those as the external port while keeping the internal port at 3389; a sketch of the resulting rules follows.
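
For illustration, here is roughly what those rules amount to if your gateway happened to be a plain Linux box running iptables (most home routers expose the same idea through a web UI; eth0 as the WAN interface and 192.168.100.13 as a second RDP host are assumptions):

# forward external TCP 3389 to the RDP host from this post
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 3389 -j DNAT --to-destination 192.168.100.12:3389
# forward external TCP 13389 to a hypothetical second RDP host, still on internal port 3389
iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 13389 -j DNAT --to-destination 192.168.100.13:3389

Clients would then connect to your-ddns-name:3389 for the first host and your-ddns-name:13389 for the second.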

Setting up a custom domain in Zoho Mail for free

This is a follow-up to the previous post, to enable my vanity email.

A friend recommended that I try out Zoho Mail, as it is free and offers the use of custom domains.

Here are the results and how to get started.

First, sign up for a free plan with Zoho at Zoho Mail Pricing and Editions – Free for 5 Users.

Do not panic; scroll down to find the free plan sign-up.

DNS routing traffic for subdomains using AWS Route 53

The long explanation is in AWS's Routing traffic for subdomains – Amazon Route 53.

GROUND RULE (or rule of thumb) of DNS: NEVER change the NS records of your domain at your domain registrar before completing the setup of your DNS server. The result can be up to 72 hours of outage.

Using my domain karmeng.my as an example, the objective is to make chow.karmeng.my a valid domain for an email host that offers custom domains.
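
Once the NS records delegating the subdomain to its Route 53 hosted zone are in place, a quick sanity check can be run from any machine with dig (chow.karmeng.my is my example subdomain; the expected answers are whatever name servers Route 53 assigned to the hosted zone, plus the mail host's records once added):

# confirm the delegation is visible in public DNS
dig NS chow.karmeng.my +short
# confirm the mail-related records once the email host's values are added
dig MX chow.karmeng.my +short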

Vagrant to orchestrate Ubuntu in VirtualBox, installing boto3 and Ansible

At the time of this post, the compatibility matrix of Vagrant and VirtualBox is as follows:

Vagrant version    VirtualBox version
2.3.7              7.0.10, 7.0.12

(Vagrant and VirtualBox compatibility matrix)

Unfortunately, Vagrant 2.4.0 does not work well with VirtualBox 7.0.

This post was created using Vagrant 2.3.7 and VirtualBox 7.0.10.

To make Vagrant work, after installing Vagrant from the HashiCorp webpage, a Vagrantfile needs to be created. The most basic setup Vagrant needs is a folder containing a Vagrantfile.

Additional post-startup provisioning scripts used in this example to complete the installations are setup.sh, install_ansible.sh, install_boto3.sh and install_python3.sh. A minimal sketch of the folder setup is shown below.
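
A minimal sketch of that folder, assuming the ubuntu/jammy64 box (any Ubuntu box name will do) and wiring the post's provisioning scripts into the Vagrantfile; the scripts themselves must sit next to the Vagrantfile:

mkdir -p vagrant-ubuntu && cd vagrant-ubuntu
cat > Vagrantfile <<'EOF'
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/jammy64"   # assumed box name; substitute your preferred Ubuntu box
  # provisioning scripts named in this post, run in order on first boot
  config.vm.provision "shell", path: "setup.sh"
  config.vm.provision "shell", path: "install_python3.sh"
  config.vm.provision "shell", path: "install_boto3.sh"
  config.vm.provision "shell", path: "install_ansible.sh"
end
EOF
vagrant up   # boots the VM in VirtualBox and runs the provisioners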

Setting up OpenVPN using Amazon Lightsail

Pre-requisites:
1. Download “openvpn-install.sh” from GitHub – angristan/openvpn-install: Set up your own OpenVPN server on Debian, Ubuntu, Fedora, CentOS or Arch Linux.
2. Have Amazon Lightsail activated, with quota available in your AWS account.
* Credits to nixCraft (cyberciti.biz) for the instructions and the script introduction: Ubuntu 20.04 LTS Set Up OpenVPN Server In 5 Minutes.
3. An SSH keypair is created and added to your AWS account.
4. Make sure the OpenVPN client is installed on your computer: OpenVPN Connect – Client Software For Windows | OpenVPN

Steps:
Creating an Ubuntu AWS instance
1. Click on “Create Instance”

2. Select OS only, and choose your favorite Linux distro; in my case Ubuntu was chosen.
At the time this blog post was created, the OpenVPN install script works on both Ubuntu 22.04 and 20.04. Once the instance is running and you have an SSH session open, the script can be fetched and run as sketched below.
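
A sketch of fetching and running the script over SSH, assuming the raw-file path published in the angristan repository (verify the exact URL against that repo's README):

# on the Lightsail Ubuntu instance
curl -fsSL -o openvpn-install.sh https://raw.githubusercontent.com/angristan/openvpn-install/master/openvpn-install.sh
chmod +x openvpn-install.sh
sudo ./openvpn-install.sh   # interactive; generates the server config and a .ovpn client profile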

Setting up AWS to use Amazon Lightsail

Requirements:

  1. Reside in a country that is not sanctioned
  2. A credit card
  3. MFA enabled and set up

Problem statement:

The misadventures started when attempting to migrate this blog from a web host to AWS. Amazon Lightsail was chosen as the easier and more wallet-friendly option.

The request to have AWS increase the quota for my AWS account started on 31st July 2023. After logging into the dashboard of the AWS web console, click on your user name on the upper right, then click Service Quotas. The same quotas can also be inspected from the CLI, as sketched below.
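
For reference, a rough equivalent using the AWS CLI (the service code lightsail and the desired value are assumptions; check the QuotaCode values returned by the first command before requesting an increase):

# list the current Lightsail quotas and note the QuotaCode of the one you need raised
aws service-quotas list-service-quotas --service-code lightsail
# request an increase, replacing <quota-code> with the code noted above
aws service-quotas request-service-quota-increase \
    --service-code lightsail --quota-code <quota-code> --desired-value 5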

Bitnami: installing Let’s Encrypt SSL for a custom fancy domain

Based on the link https://docs.bitnami.com/aws/how-to/generate-install-lets-encrypt-ssl/

For users with a custom domain similar to mine, chow.karmeng.my, it is important to change the --tls switch in the lego command to --http:

sudo /opt/bitnami/letsencrypt/lego --http --email="youremail@yourdomain.com" --domains="domain.com" --domains="fancy.domain.com" --path="/opt/bitnami/letsencrypt" run
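
Per the Bitnami guide, the HTTP-01 challenge needs port 80 free, so it may help to stop the Bitnami services before running lego and restart them afterwards. A rough sketch, assuming the standard Bitnami ctlscript.sh path and the placeholder email/domains from the command above:

sudo /opt/bitnami/ctlscript.sh stop    # free port 80 for the HTTP-01 challenge
sudo /opt/bitnami/letsencrypt/lego --http --email="youremail@yourdomain.com" --domains="domain.com" --domains="fancy.domain.com" --path="/opt/bitnami/letsencrypt" run
sudo /opt/bitnami/ctlscript.sh start
# later renewals reuse the same flags, swapping "run" for "renew --days 90"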

Working with AWS EC2 burstable Instances

Technically inclined folks may choose to use the “free” tier for the first two months of AWS Lightsail. Free comes at the cost of burstable CPU consumption. Lightsail users need to get familiar with the Average CPU Utilization and Remaining CPU burst capacity graphs, as shown in the screenshot below.

Once the AWS EC2 burstable instance has started, AWS adds a small amount of burst credit as long as the instance's CPU usage does not exceed 10%. If deploying software onto the instance pushes CPU usage above 10%, the CPU burst capacity is deducted instead. Once the remaining CPU burst capacity reaches 0%, the performance of the EC2 instance is capped.

In order for the CPU burst capacity to build back up to 100%, the instance's CPU utilization must stay below 10%. This allows the CPU burst capacity to accumulate back to 100% over a period of roughly 27 hours.

Therefore, developers and system architects need to understand this feature of EC2 burstable CPUs and allow ample time for the CPU burst capacity to build up. In the real world, software provisioning during a deployment may use up all of the CPU burst capacity and cause a bad user experience once the application is serving traffic in the AWS cloud. If I were an enterprise solutions architect, I would not use EC2 instances with burstable CPU capacity.
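
For reference, the credit balance behind that burst-capacity graph can also be watched from the CLI; a rough sketch using CloudWatch's CPUCreditBalance metric for EC2 burstable (T-family) instances, with a placeholder instance ID and time window:

aws cloudwatch get-metric-statistics \
    --namespace AWS/EC2 --metric-name CPUCreditBalance \
    --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
    --statistics Average --period 300 \
    --start-time 2023-07-31T00:00:00Z --end-time 2023-07-31T06:00:00Z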

Preventing System.OutOfMemoryException when calculating Azure Blob Storage size using CalculateBlobCost.ps1

A month ago, I was working on migrating a system from Azure to another cloud platform. At the time of this post, the Azure web management portal had no UI to show the amount of space used by your Blob Storage. One could only find out how much one owes Azure from the Billing portal.

Fortunately, the Microsoft Azure team came out with a simple PowerShell script that provides better insight into how many blob objects exist and how much space your Azure Blob storage uses. The solution was provided at Get Billable size for Azure Blob at MSDN.

The script provided works fine, until it stops due to a System.OutOfMemoryException:

PS C:\dev\powershell> .\CalculateBlobCost.ps1 -storageaccountname "myProdatAmerNorth"
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Automation\Microsoft.Azure.Commands.Automation.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\TrafficManager\Microsoft.WindowsAzure.Commands.TrafficManager.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Profile.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Compute\Microsoft.WindowsAzure.Commands.ServiceManagement.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Storage\Microsoft.WindowsAzure.Commands.Storage.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\ManagedCache\Microsoft.Azure.Commands.ManagedCache.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\HDInsight\Microsoft.WindowsAzure.Commands.HDInsight.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Networking\Microsoft.WindowsAzure.Commands.ServiceManagement.Network.dl
l'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\StorSimple\Microsoft.WindowsAzure.Commands.StorSimple.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RemoteApp\Microsoft.WindowsAzure.Commands.RemoteApp.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft
SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RecoveryServices\Microsoft.Azure.Commands.RecoveryServices.dll'.
VERBOSE: 11:35:12 AM - Begin Operation: Get-AzureStorageAccount
VERBOSE: 11:35:14 AM - Completed Operation: Get-AzureStorageAccount
WARNING: GeoReplicationEnabled property will be deprecated in a future release of Azure PowerShell. The value will be
merged into the AccountType property.
VERBOSE: 11:35:14 AM - Begin Operation: Get-AzureStorageKey
VERBOSE: 11:35:16 AM - Completed Operation: Get-AzureStorageKey
VERBOSE: Container 'mySettings' with 5 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException

Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was
thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : StorageException

Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was
thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : StorageException

Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was
thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [], MethodInvocationException
+ FullyQualifiedErrorId : StorageException

Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:124 char:13
+ $containerSizeInBytes += Get-BlobBytes $_
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException

ForEach-Object : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:123 char:9
+ ForEach-Object {
+ ~~~~~~~~~~~~~~~~
+ CategoryInfo : NotSpecified: (:) [ForEach-Object], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.ForEachObjectCommand

VERBOSE: Container 'whoamI-version-2012' with 79895 blobs has a size of 33.56MB.
VERBOSE: Container 'whoamI-version-2' with 0 blobs has a size of 0.00MB.
VERBOSE: Container 'whoamI-version-3' with 0 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : OperationStopped: (:) [], OutOfMemoryException
+ FullyQualifiedErrorId : System.OutOfMemoryException

VERBOSE: Container 'whoamI-version-2' with 10070 blobs has a size of 7.29MB.
VERBOSE: Container 'whoamI-version-3' with 423186 blobs has a size of 1337.75MB.
VERBOSE: Container 'whoamI-version-4' with 95 blobs has a size of 95.96MB.

A few hours later, I identified that one of my Azure Blob containers, “whoamI-version-3”, with more than 1 million objects and a size exceeding 1GB, was the culprit that caused the script to fail. The cmdlet that failed was Get-AzureStorageBlob.

Per MSDN, Get-AzureStorageBlob has a -MaxCount parameter that allows the objects in the Azure Blob container to be processed in batches, similar to SQL paging.

By using the -MaxCount parameter and adding continuation-token logic, the improved script is now able to run without any fear of OutOfMemoryException.
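
A rough sketch of that continuation-token loop, following the paging pattern from the MSDN Get-AzureStorageBlob documentation rather than the exact code of the attached script ($containerName and $storageContext are placeholders; Get-BlobBytes is the helper function from the original CalculateBlobCost.ps1):

# Walk the container in pages of at most $maxCount blobs so the full listing
# of a million-object container never has to sit in memory at once.
$maxCount = 10000
$token = $null
$containerSizeInBytes = 0
do {
    $blobs = Get-AzureStorageBlob -Container $containerName -Context $storageContext `
                                  -MaxCount $maxCount -ContinuationToken $token
    if ($blobs.Count -le 0) { break }
    $blobs | ForEach-Object { $containerSizeInBytes += Get-BlobBytes $_ }
    # the token for the next page rides on the last blob of the current page
    $token = $blobs[$blobs.Count - 1].ContinuationToken
} while ($token -ne $null)
Write-Verbose ("Container '{0}' has a size of {1:F2}MB." -f $containerName, ($containerSizeInBytes / 1MB))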

To use the improved PowerShell script immediately, kindly download the file attached to this post: CalculateBlobCost PowerShell script.

How to install mysql-proxy quick and easy

On Linux, whether Ubuntu (apt) or Red Hat/CentOS (yum), just use the default package manager. For this post I am using Ubuntu as the example to illustrate my point.

sudo apt-get install mysql-proxy

You should get the following confirmation:

Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
ttf-dejavu-extra libevent-extra-2.0-5 libmysql++3 libdbi1 libapr1 librrd4 libcairo2 libmysqlclient-dev libaprutil1-ldap libthai-data libreadline6-dev libpcrecpp0 libsvn1 libdatrie1
fontconfig libtinfo-dev libpixman-1-0 libevent-openssl-2.0-5 libaprutil1-dbd-sqlite3 libonig2 libthai0 libneon27-gnutls zlib1g-dev libevent-pthreads-2.0-5 libzzip-0-13 ttf-dejavu
libdb4.8 libpq5 libpcre3-dev libpango1.0-0 libxcb-render0 libxcb-shm0 libaprutil1 libevent-core-2.0-5 libfcgi0ldbl libreadline-dev

However, to run mysql-proxy a few tweaks are needed, assuming that you are not using mysql-proxy for anything beyond being just a proxy.

To start up mysql-proxy, you may issue a command such as this:
sudo mysql-proxy --defaults-file=/etc/mysql/mysql_proxy.cnf &

The configuration in mysql_proxy.cnf:

[mysql-proxy]
# the [mysql-proxy] group header is required by the keyfile format used with --defaults-file
log-file = /opt/apps/logs/mysql-proxy/mysql-proxy.log
log-level = debug
proxy-backend-addresses = 10.161.89.64:3306
admin-username = root
admin-password = for_my_eyes_only

If mysql-proxy fails to start as a daemon, it is best to check the log at /opt/apps/logs/mysql-proxy/mysql-proxy.log:

root@ckm-myprox:/opt/apps/logs/mysql-proxy$ sudo tail -f mysql-proxy.log
2014-06-11 20:58:27: (message) mysql-proxy 0.8.1 started
2014-06-11 20:58:27: (debug) max open file-descriptors = 1024
2014-06-11 20:58:27: (critical) admin-plugin.c:579: --admin-lua-script needs to be set, /lib/mysql-proxy/lua/admin.lua may be a good value
2014-06-11 20:58:27: (critical) mainloop.c:267: applying config of plugin admin failed
2014-06-11 20:58:27: (critical) mysql-proxy-cli.c:596: Failure from chassis_mainloop. Shutting down.
2014-06-11 20:58:27: (message) Initiating shutdown, requested from mysql-proxy-cli.c:597
2014-06-11 20:58:27: (message) shutting down normally, exit code is: 1

The log confirms that mysql-proxy failed to start because the admin Lua script was not set. To skip the admin plugin (assuming it will not be used), start mysql-proxy with only the proxy plugin:
sudo mysql-proxy --defaults-file=/etc/mysql/mysql_proxy.cnf --plugins=proxy &

Take note that at the time of this post on installing mysql-proxy, the latest release is 0.8.4, while the stock version from the Ubuntu repository is 0.8.1.
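
To confirm the proxy is actually forwarding connections, point a MySQL client at it; a rough check assuming mysql-proxy's default listen port 4040 (no proxy-address was set in the config above) and placeholder credentials:

mysql -h 127.0.0.1 -P 4040 -u your_app_user -p
# queries issued in this session are relayed to the backend at 10.161.89.64:3306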