
Kiyomai is the wine's brand name; the wine type is Tokachi wine. It was brought home by my parents and sister after their trip to Hokkaido, Japan.
This post is based on the guide at https://docs.bitnami.com/aws/how-to/generate-install-lets-encrypt-ssl/
For users with a custom domain similar to mine, chow.karmeng.my, it is important to change the --tls switch in the lego command to --http:
sudo /opt/bitnami/letsencrypt/lego --http --email="youremail@yourdomain.com" --domains="domain.com" --domains="fancy.domain.com" --path="/opt/bitnami/letsencrypt" run
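Let's Encrypt certificates expire after 90 days, so the same lego binary has to be re-run to renew them. A minimal renewal sketch, assuming the same Bitnami paths and the same placeholder email and domains as above (replace them with your own values):
sudo /opt/bitnami/letsencrypt/lego --http --email="youremail@yourdomain.com" --domains="domain.com" --domains="fancy.domain.com" --path="/opt/bitnami/letsencrypt" renew --days 90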
Technically inclined folks may choose to use the "free" tier for the first 2 months of AWS Lightsail. Free comes at the cost of burstable CPU consumption. Lightsail users need to get familiar with the Average CPU Utilization and Remaining CPU burst capacity graphs, as shown in the screenshot below.
Once an AWS EC2 burstable instance starts, AWS adds a small amount of burst credit as long as the instance's CPU usage does not exceed 10%. If the software deployed onto the instance pushes CPU usage above 10%, the CPU burst capacity is deducted. When the remaining CPU burst capacity reaches 0%, the EC2 instance's performance is capped.
For the CPU burst capacity to build back up to 100%, the instance's CPU utilization must not exceed 10%; at that rate, the burst capacity accumulates to 100% over a period of roughly 27 hours.
Therefore, developers and systems architects need to understand this EC2 burstable CPU behaviour and allow ample time for the CPU burst capacity to build up. In the real world, provisioning software during deployment may use up all of the CPU burst capacity and cause a poor software performance experience in the AWS cloud. If I were an enterprise solutions architect, I would not use EC2 instances with burstable CPU capacity.
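For those who do stay on a burstable instance, the remaining credit does not have to be watched from the console graph alone. A minimal sketch of pulling the credit balance with the AWS CLI, assuming a plain EC2 T2-class instance (the instance ID is a placeholder, and GNU date is assumed for the timestamps):
# Average CPUCreditBalance over the last 3 hours, in 5-minute buckets
aws cloudwatch get-metric-statistics \
  --namespace AWS/EC2 \
  --metric-name CPUCreditBalance \
  --dimensions Name=InstanceId,Value=i-0123456789abcdef0 \
  --start-time "$(date -u -d '3 hours ago' +%Y-%m-%dT%H:%M:%SZ)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%SZ)" \
  --period 300 \
  --statistics Average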
Welcome to the AWS-hosted site.
A month ago, I was working on migrating a system from Azure to another cloud platform. The Azure web management portal, at the time of this post, has no UI that shows the amount of space used by your Blob Storage. One can only find out how much one owes Azure from the Billing portal.
Fortunately, the Microsoft Azure team has come out with a simple PowerShell script that gives better insight into how many blob objects there are and how much space your Azure Blob Storage uses. The solution is provided at "Get Billable size for Azure Blob" on MSDN.
The script provided works fine, until it stops with a System.OutOfMemoryException:
PS C:\dev\powershell> .\CalculateBlobCost.ps1 -storageaccountname "myProdatAmerNorth"
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Automation\Microsoft.Azure.Commands.Automation.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\TrafficManager\Microsoft.WindowsAzure.Commands.TrafficManager.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Services\Microsoft.WindowsAzure.Commands.Profile.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Compute\Microsoft.WindowsAzure.Commands.ServiceManagement.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Sql\Microsoft.WindowsAzure.Commands.SqlDatabase.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Storage\Microsoft.WindowsAzure.Commands.Storage.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\ManagedCache\Microsoft.Azure.Commands.ManagedCache.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\HDInsight\Microsoft.WindowsAzure.Commands.HDInsight.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\Networking\Microsoft.WindowsAzure.Commands.ServiceManagement.Network.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\StorSimple\Microsoft.WindowsAzure.Commands.StorSimple.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RemoteApp\Microsoft.WindowsAzure.Commands.RemoteApp.dll'.
VERBOSE: Loading module from path 'C:\Program Files (x86)\Microsoft SDKs\Azure\PowerShell\ServiceManagement\Azure\.\RecoveryServices\Microsoft.Azure.Commands.RecoveryServices.dll'.
VERBOSE: 11:35:12 AM - Begin Operation: Get-AzureStorageAccount
VERBOSE: 11:35:14 AM - Completed Operation: Get-AzureStorageAccount
WARNING: GeoReplicationEnabled property will be deprecated in a future release of Azure PowerShell. The value will be merged into the AccountType property.
VERBOSE: 11:35:14 AM - Begin Operation: Get-AzureStorageKey
VERBOSE: 11:35:16 AM - Completed Operation: Get-AzureStorageKey
VERBOSE: Container 'mySettings' with 5 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException
Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : StorageException
Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : StorageException
Exception calling "DownloadBlockList" with "0" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : StorageException
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:124 char:13
+ $containerSizeInBytes += Get-BlobBytes $_
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException
ForEach-Object : Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:123 char:9
+ ForEach-Object {
+ ~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [ForEach-Object], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException,Microsoft.PowerShell.Commands.ForEachObjectCommand
VERBOSE: Container 'whoamI-version-2012' with 79895 blobs has a size of 33.56MB.
VERBOSE: Container 'whoamI-version-2' with 0 blobs has a size of 0.00MB.
VERBOSE: Container 'whoamI-version-3' with 0 blobs has a size of 0.00MB.
Exception of type 'System.OutOfMemoryException' was thrown.
At C:\dev\powershell\CalculateBlobCost.ps1:77 char:9
+ $Blob.ICloudBlob.DownloadBlockList() |
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : OperationStopped: (:) [], OutOfMemoryException
    + FullyQualifiedErrorId : System.OutOfMemoryException
VERBOSE: Container 'whoamI-version-2' with 10070 blobs has a size of 7.29MB.
VERBOSE: Container 'whoamI-version-3' with 423186 blobs has a size of 1337.75MB.
VERBOSE: Container 'whoamI-version-4' with 95 blobs has a size of 95.96MB.
A few hours later, I identified that one of my Azure Blob containers, 'whoamI-version-3', which holds more than a million objects and exceeds 1GB in size, was the culprit that caused the script to fail. The cmdlet that failed was Get-AzureStorageBlob.
According to MSDN, Get-AzureStorageBlob has a -MaxCount parameter that makes it possible to enumerate the objects in a blob container in batches, similar to SQL paging.
By using the -MaxCount parameter and adding continuation-token logic, the improved script now runs without any fear of an OutOfMemoryException; the core of the change looks roughly like the sketch below.
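A minimal sketch of the paging pattern, where the container name, storage context, and 10,000 batch size are placeholders, and Get-BlobBytes is the per-blob size helper from the original MSDN script:
$containerSizeInBytes = 0
$MaxReturn = 10000
$Token = $null
do {
    # Fetch at most $MaxReturn blobs per call instead of enumerating the whole container at once
    $Blobs = Get-AzureStorageBlob -Container $ContainerName -Context $StorageContext -MaxCount $MaxReturn -ContinuationToken $Token
    if ($Blobs -eq $null) { break }
    foreach ($Blob in $Blobs) {
        $containerSizeInBytes += Get-BlobBytes $Blob
    }
    # The last blob of each batch carries the token that points to the next page
    $Token = $Blobs[$Blobs.Count - 1].ContinuationToken
} while ($Token -ne $null)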
To use the improved PowerShell script right away, kindly download the file attached to this post: CalculateBlobCost PowerShell script.
On Linux, whether Ubuntu (apt) or Red Hat/CentOS (yum), just use the default package manager. For this post I am using Ubuntu as the example to illustrate my point.
sudo apt-get install mysql-proxy
You should get the following confirmation:
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following packages were automatically installed and are no longer required:
ttf-dejavu-extra libevent-extra-2.0-5 libmysql++3 libdbi1 libapr1 librrd4 libcairo2 libmysqlclient-dev libaprutil1-ldap libthai-data libreadline6-dev libpcrecpp0 libsvn1 libdatrie1
fontconfig libtinfo-dev libpixman-1-0 libevent-openssl-2.0-5 libaprutil1-dbd-sqlite3 libonig2 libthai0 libneon27-gnutls zlib1g-dev libevent-pthreads-2.0-5 libzzip-0-13 ttf-dejavu
libdb4.8 libpq5 libpcre3-dev libpango1.0-0 libxcb-render0 libxcb-shm0 libaprutil1 libevent-core-2.0-5 libfcgi0ldbl libreadline-dev
However, to run mysql-proxy a few tweaks are needed, assuming that you are not using mysql-proxy for anything beyond being a plain proxy.
To start up mysql-proxy, you may issue a command such as this:
sudo mysql-proxy --defaults-file=/etc/mysql/mysql_proxy.cnf &
The content of mysql_proxy.cnf (the --defaults-file parser expects the settings to sit under a [mysql-proxy] group header):
[mysql-proxy]
log-file = /opt/apps/logs/mysql-proxy/mysql-proxy.log
log-level = debug
proxy-backend-addresses = 10.161.89.64:3306
admin-username = root
admin-password = for_my_eyes_only
If mysql-proxy fails to start as a daemon, it is best to check the log at /opt/apps/logs/mysql-proxy/mysql-proxy.log:
root@ckm-myprox:/opt/apps/logs/mysql-proxy$ sudo tail -f mysql-proxy.log
2014-06-11 20:58:27: (message) mysql-proxy 0.8.1 started
2014-06-11 20:58:27: (debug) max open file-descriptors = 1024
2014-06-11 20:58:27: (critical) admin-plugin.c:579: --admin-lua-script needs to be set, /lib/mysql-proxy/lua/admin.lua may be a good value
2014-06-11 20:58:27: (critical) mainloop.c:267: applying config of plugin admin failed
2014-06-11 20:58:27: (critical) mysql-proxy-cli.c:596: Failure from chassis_mainloop. Shutting down.
2014-06-11 20:58:27: (message) Initiating shutdown, requested from mysql-proxy-cli.c:597
2014-06-11 20:58:27: (message) shutting down normally, exit code is: 1
The log confirms that mysql-proxy failed to start because the admin Lua script was not set. To skip the admin plugin (assuming it will not be used), start mysql-proxy with:
sudo mysql-proxy --defaults-file=/etc/mysql/mysql_proxy.cnf --plugins=proxy &
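To confirm the proxy is actually forwarding traffic, point a MySQL client at the proxy host; by default mysql-proxy listens on port 4040 unless a proxy-address is set in the configuration file. A quick test, with placeholder credentials:
mysql -h 127.0.0.1 -P 4040 -u myuser -p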
Take note that in this post the mysql-proxy version being installed from source is 0.8.4, while the stock version from the Ubuntu repository is 0.8.1.
sudo wget http://dev.mysql.com/get/Downloads/MySQL-Proxy/mysql-proxy-0.8.4.tar.gz
sudo tar -xzvf mysql-proxy-0.8.4.tar.gz
sudo apt-get install libmysql++-dev
cd mysql-proxy-0.8.4
sudo ./configure
configure: error: The pkg-config script could not be found or is too old. Make sure it is in your PATH or set the PKG_CONFIG environment variable to the full path to pkg-config.
sudo apt-get install pkg-config
checking pkg-config is at least version 0.9.0... yes
checking for LUA... no
...
checked for Lua via pkg-config: No package 'lua' found. retrying with lua5.1
checking for LUA... no
configure: error: checked for Lua via pkg-config: No package 'lua5.1' found. Make sure lua and its devel-package, which includes the lua5.1.pc (debian and friends) or lua.pc (all others) file, is installed
sudo apt-get install lua5.1
sudo apt-get install liblua5.1
sudo apt-get install liblua5.1-sql-mysql2
checking pkg-config is at least version 0.9.0... yes
checking for LUA... no
...
checked for Lua via pkg-config: No package 'lua' found. retrying with lua5.1
checking for LUA... yes
checking for GLIB... configure: error: Package requirements (glib-2.0 >= 2.16.0) were not met:
sudo apt-get install glib2.0
sudo apt-get install libglib2.0-0
configure: error: libevent is required
sudo apt-get install libevent-2.0-5
sudo apt-get install libevent-dev
sudo make
sudo make install
sudo mysql-proxy
mysql-proxy: error while loading shared libraries: libmysql-chassis.so.0: cannot open shared object file: No such file or directory
Running ldconfig refreshes the shared library cache so that the freshly installed libmysql-chassis.so.0 can be found:
sudo ldconfig
Usage:
mysql-proxy [OPTION...] - MySQL Proxy
Help Options:
-h, --help Show help options
--help-all Show all help options
--help-proxy Show options for the proxy-module
Application Options:
-V, --version Show version
--defaults-file= configuration file
--verbose-shutdown Always log the exit code when shutting down
--daemon Start in daemon-mode
--user= Run mysql-proxy as user
--basedir= Base directory to prepend to relative paths in the config
--pid-file= PID file in case we are started as daemon
--plugin-dir= path to the plugins
--plugins= plugins to load
--log-level=(error|warning|info|message|debug) log all messages of level ... or higher
--log-file= log all messages in a file
--log-use-syslog log all messages to syslog
--log-backtrace-on-crash try to invoke debugger on crash
--keepalive try to restart the proxy if it crashed
--max-open-files maximum number of open files (ulimit -n)
--event-threads number of event-handling threads (default: 1)
--lua-path= set the LUA_PATH
--lua-cpath= set the LUA_CPATH
The Covenant of Primus is a collectible item for Transformers fans. The book is housed in a case shaped like the Autobot insignia, or shield. What is amazing is that even the book casing is meticulously packed in well-designed cardboard packaging, made to prevent damage to the cover that houses the book.
Good news for AWS users: Amazon released the new unified AWS CLI in September 2013, and provides multiple ways to install it.
I have to admit the installation is more straightforward and simpler compared to the old AWS CLI.
At the time of this post, the released AWS CLI version is 1.2.6, and it runs on Python 2.6. Hence, this post covers a custom install of the AWS CLI onto a Linux-based machine.
For users who are planning to use the AWS CLI bundle provided by Amazon, here are the recommended steps in sequence. Disclaimer and note: I have not added any form of error checking or Linux distro detection, and I am assuming the Linux distro used is Red Hat.
Installing AWSCli using the Amazon awscli-bundle
mkdir -p /opt/apps/tmp
cd /opt/apps/tmp
wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
unzip awscli-bundle.zip
mkdir -p /opt/apps/$(ls awscli-bundle/packages/ | egrep -o 'awscli-[0-9]\.[0-9]\.[0-9]')
./awscli-bundle/install -i /opt/apps/$(ls awscli-bundle/packages/ | egrep -o 'awscli-[0-9]\.[0-9]\.[0-9]')
ln -s /opt/apps/$(ls awscli-bundle/packages/ | egrep -o 'awscli-[0-9]\.[0-9]\.[0-9]') /opt/apps/awscli
ln -s /opt/apps/awscli/bin/aws /usr/bin/aws
ln -s /opt/apps/awscli/bin/aws.cmd /usr/bin/aws.cmd
/opt/apps/awscli/bin/aws --version
cd ~
rm -Rf /opt/apps/tmp
Installing the AWS CLI via pip
python --version
apt-get install python-pip
yum install python-pip
cd /opt/apps/
mkdir tmp
cd tmp
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py
wget https://raw.github.com/pypa/pip/master/contrib/get-pip.py
python ez_setup.py
python get-pip.py
pip install awscli==1.2.6
aws help
cd ~
rm -Rf /opt/apps/tmp
The advantage of the AWS CLI bundle over the pip method is ease of installation, without needing to get ez_setup and pip installed first. Since the bundle zip is hosted within the Amazon Web Services network, it took me less than a second to download the 5MB awscli-bundle.zip file.
[root@ip-10-255-255-1 ~]# time wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
--2013-11-28 05:25:16-- https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
Resolving s3.amazonaws.com... 176.32.99.46
Connecting to s3.amazonaws.com|176.32.99.46|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5139105 (4.9M) [application/zip]
Saving to: `awscli-bundle.zip'
100%[==========================================>] 5,139,105 16.0M/s in 0.3s
2013-11-28 05:25:17 (16.0 MB/s) - `awscli-bundle.zip' saved [5139105/5139105]
real 0m0.611s
user 0m0.096s
sys 0m0.036s
[root@ip-10-255-255-1 ~]# time wget https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
--2013-11-28 05:25:49-- https://s3.amazonaws.com/aws-cli/awscli-bundle.zip
Resolving s3.amazonaws.com... 176.32.99.46
Connecting to s3.amazonaws.com|176.32.99.46|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5139105 (4.9M) [application/zip]
Saving to: `awscli-bundle.zip'
100%[==========================================>] 5,139,105 19.6M/s in 0.2s
2013-11-28 05:25:49 (19.6 MB/s) - `awscli-bundle.zip' saved [5139105/5139105]
real 0m0.338s
user 0m0.076s
sys 0m0.036s
The downside of the awscli-bundle installation is that users need to upload awscli-bundle.zip into a personal version control server or service (such as GitHub) in order to keep the AWS CLI itself under version control. This adds overhead for maintaining AWS CLI versions, which is labour-intensive and complicates the process.
The only disadvantages of the pip-installed AWS CLI are the prerequisite of installing pip itself and, possibly in the future, the permanent removal of older awscli versions from the public pip repository.
Administrators using pip can install awscli into a custom directory such as /opt/apps with the following pip command:
pip install --install-option="--prefix=/opt/apps" awscli==1.2.6
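Because the package then lands under the /opt/apps prefix rather than in Python's default site-packages, the shell and interpreter need to be told where to look. A minimal sketch, assuming Python 2.6 and the prefix above (the exact site-packages path depends on the Python version):
# Expose the prefix-installed scripts and modules
export PATH=/opt/apps/bin:$PATH
export PYTHONPATH=/opt/apps/lib/python2.6/site-packages:$PYTHONPATH
aws --version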
Unfortunately, in doing so, pip will no longer be able to manage the awscli package. Administrators will need to put in a small amount of effort to remove the installed version manually before upgrading the AWS CLI with a similar command.
In closing, in my personal opinion pip is the better way to install the AWS CLI and maintain its version.
@echo off
powershell -version 2.0 -ExecutionPolicy unrestricted %~dp0generate_putty_sessions.ps1
regedit.exe /s %userprofile%\putty_list.reg
#Preloading scripts
#Removing old reg file
if ( Test-Path $env:userprofile\putty_list.reg){
del $env:userprofile\putty_list.reg
}
#Check environment for Windows x86 or x86_64
if ([IntPtr]::Size -eq 4){
if ( Test-Path "C:\Program Files\AWS Tools\PowerShell\AWSPowerShell"){
import-module "C:\Program Files\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
}
else{
write-host "AWS Tools for PowerShell was not install, exiting. Download at http://aws.amazon.com/powershell/"
exit
}
}
else{
if ( Test-Path "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell"){
import-module "C:\Program Files (x86)\AWS Tools\PowerShell\AWSPowerShell\AWSPowerShell.psd1"
}
else{
write-host "AWS Tools for PowerShell was not install, exiting. Download at http://aws.amazon.com/powershell/"
exit
}
}
#Check env variable for required EC2 configuration
if (-not (Test-Path Env:\EC2_HOME)){
write-host "Environment variable EC2_HOME was not found, please ensure your EC2 API Tools are properly installed and configured."
exit
}
if (-not (Test-Path Env:\EC2_CERT)){
write-host "Environment variable EC2_CERT was not found, please ensure your EC2 API Tools are properly installed and configured."
exit
}
if (-not (Test-Path Env:\EC2_PRIVATE_KEY)){
write-host "Environment variable EC2_PRIVATE_KEY was not found, please ensure your EC2 API Tools are properly installed and configured."
exit
}
#Get my script path
$myPath = split-path -parent $MyInvocation.MyCommand.Definition
if(-not (Test-Path -path $myPath\reg_header.txt)){
write-host "Please make sure reg_header.txt is in " $myPath
exit
}
if(-not (Test-Path -path $myPath\reg_putty.txt)){
write-host "Please make sure reg_header.txt is in " $myPath
exit
}
Copy-Item $myPath\reg_header.txt $env:userprofile
Copy-Item $myPath\reg_putty.txt $env:userprofile
#Main body and function of the script.
#Creating file to link instance ID with Public DNS
ec2-describe-instances --filter `"virtualization-type=paravirtual`" --filter `"instance-state-name=running`" --filter `"tag:Name=/*/*`" | Select-String -pattern INSTANCE -caseSensitive | foreach { "$($_.ToString().split()[1,3])" >> $env:userprofile\awsinstanceIP.tmp}
#Creating a file to link instance ID with Name tag
ec2-describe-instances --filter `"virtualization-type=paravirtual`" --filter `"instance-state-name=running`" --filter `"tag:Name=/*/*`" | Select-String -pattern Name -caseSensitive | foreach { "$($_.ToString().split()[2,4])" >> $env:userprofile\awsinstanceName.tmp}
# Clean up results, removing RenderWorkerGroup
Get-Content $env:userprofile\awsinstanceName.tmp | Select-String -pattern RenderWorkerGroup -NotMatch | foreach { "$($_.ToString().split()[0,1])" >> $env:userprofile\awsinstanceNameClean.tmp}
#$awsInstanceIDIP = Get-Content $env:userprofile\awsinstanceIP.tmp
$awsInstanceCleanName = Get-Content $env:userprofile\awsinstanceNameClean.tmp
$count = 0
# Create HashTable from File.
ForEach ($line in $awsInstanceCleanName) {
if ($count -le 0 ) {
$myHash = @{ $line.ToString().Split()[0] = $line.ToString().Split()[1]}
}
else{
$myHash.Set_Item($line.ToString().Split()[0], $line.ToString().Split()[1])
}
$count = $count + 1
}
$count = 0
Get-Content $env:userprofile\awsinstanceIP.tmp | ForEach-Object {
$line = $_
$myHash.GetEnumerator() | ForEach-Object {
if ($line -match $_.Key)
{
if ($_.value.ToString().Contains("render-worker")){
$replacement = $_.Key.ToString() + " " + $_.Value.ToString() + "/" + $_.Key.ToString()
}
else{
$replacement = $_.Key.ToString() + " " + $_.Value.ToString()
}
$line = $line -replace $_.Key, $replacement
}
}
$line
} | Set-Content -Path $env:userprofile\awsinstanceResult.tmp
del $env:userprofile\awsinstanceIP.tmp
del $env:userprofile\awsinstanceName.tmp
del $env:userprofile\awsinstanceNameClean.tmp
$awsinstanceResult = Get-Content $env:userprofile\awsinstanceResult.tmp
#Adding header into file content.
Add-Content $env:userprofile\awsinstanceReg_List.tmp $(Get-Content $env:userprofile\reg_header.txt)
Add-Content $env:userprofile\awsinstanceReg_List.tmp "`r"
# Populating body of the file before converting into registry file.
foreach ($line in $awsinstanceResult){
$reg_line = "`[HKEY_CURRENT_USER\Software\Simontatham\PuTTY\Sessions\" + $line.ToString().Split()[1] + "]"
Add-Content $env:userprofile\awsinstanceReg_List.tmp $reg_line
$reg_line = "`"HostName`"=`"" + $line.ToString().Split()[2] + "`""
Add-Content $env:userprofile\awsinstanceReg_List.tmp $reg_line
# Add fillers to the sessions
Add-Content $env:userprofile\awsinstanceReg_List.tmp $(Get-Content $env:userprofile\reg_putty.txt)
Add-Content $env:userprofile\awsinstanceReg_List.tmp "`r"
}
Get-Content $env:userprofile\awsinstanceReg_List.tmp | Add-Content $env:userprofile\putty_list.reg
#Removing all temporary files
del $env:userprofile\awsinstanceReg_List.tmp
del $env:userprofile\awsinstanceResult.tmp
del $env:userprofile\reg_header.txt
del $env:userprofile\reg_putty.txt