PowerShell, the AWS CLI, and JSON
When you work with the Amazon cloud, you often end up performing many routine tasks through the web console. I wanted to automate them.
Article based on information from habrahabr.ru
The AWS CLI, a command-line interface, is well suited for this. Of course, you could write an application in Scala, but for everyday tasks it is better to do without the heavy artillery.
AWS commands can return data in several formats, including JSON. You could use bash and jq, but the latter is not in the Cygwin repository and I was too lazy to install it. Meanwhile, PowerShell has great support for JSON! The catch is that using it is not entirely straightforward:
C:\Users\mpotanin> $instance = aws ec2 describe-instances --instance-ids i-ecf1fe5c
C:\Users\...> ConvertFrom-Json $instance
ConvertFrom-Json : Cannot convert 'System.Object[]' to the type 'System.String' required by parameter 'InputObject'. The specified method is not supported.
At line:1 char:19
+ ConvertFrom-Json $instance
+ ~~~~~~~~~
+ CategoryInfo : InvalidArgument: (:) [ConvertFrom-Json], ParameterBindingException
+ FullyQualifiedErrorId : CannotConvertArgument,Microsoft.PowerShell.Commands.ConvertFromJsonCommand
The problem is that the JSON arrives as several lines of output and is therefore treated as an array of strings. It can be fed to PowerShell in several ways:
$instance | ConvertFrom-Json
ConvertFrom-Json ($instance -join "")
ConvertFrom-Json "$instance"
Getting at the right piece of information is now easy:
(aws ec2 describe-instances --instance-ids i-ecf1fe5c | ConvertFrom-Json).Reservations[0].Instances[0].PublicIpAddress
In simple cases, the extraction can be delegated to aws itself:
aws ec2 describe-instances --instance-ids i-ecf1fe5c --query 'Reservations[0].Instances[0].PublicIpAddress' | ConvertFrom-Json
or even
aws ec2 describe-instances --instance-ids i-ecf1fe5c --query 'Reservations[0].Instances[0].PublicIpAddress' --output=text
So, when watching the strange behavior of an autoscaling group in the console, I can copy the instance name with the mouse and log in over ssh:
function get-address([String]$instanceId) {
    aws ec2 describe-instances --instance-ids $instanceId --query 'Reservations[0].Instances[0].PublicIpAddress' | ConvertFrom-Json
}
ssh -i devkey.pem -l ubuntu (get-address i-ecf1fe5c)
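The lookup and the login can also be folded into one helper. This is a hypothetical convenience wrapper, not from the original article; the devkey.pem key and the ubuntu user are assumptions carried over from the example above:

```powershell
# Hypothetical helper: resolve the public IP and log in, in one step.
# Assumes the same devkey.pem key and ubuntu user as in the example above.
function connect-instance([String]$instanceId) {
    $addr = aws ec2 describe-instances --instance-ids $instanceId `
        --query 'Reservations[0].Instances[0].PublicIpAddress' --output text
    ssh -i devkey.pem -l ubuntu $addr
}
connect-instance i-ecf1fe5c
```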
In more complex cases you have to write non-trivial code. For example, the output parameters of a CloudFormation stack can be obtained as follows:
function reduce ($f, $a) {
    if ($a.Length -eq 1) {
        $a
    } else {
        $p = $a[0]
        foreach ($x in 1..($a.Length - 1)) {
            $p = $f.Invoke($p, $a[$x])[0]
        }
        $p
    }
}
function get-json($data) {
    ConvertFrom-Json "$data"
}
function get-cf-outputs($cfName) {
    $o = (get-json (aws cloudformation describe-stacks --stack-name $cfName)).Stacks[0].Outputs
    $r = foreach ($i in $o) { @{ $i.OutputKey = $i.OutputValue } }
    reduce { param($a, $i) $a + $i } $r
}
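Since reduce is only used here to merge one-entry hashtables, the same result can be obtained more directly by accumulating into a single hashtable. This is a sketch of a hypothetical alternative, not the original code:

```powershell
# Hypothetical variant of get-cf-outputs: build one hashtable in a loop
# instead of merging many single-entry hashtables with reduce.
function get-cf-outputs2($cfName) {
    $json = (aws cloudformation describe-stacks --stack-name $cfName) -join ""
    $r = @{}
    foreach ($i in (ConvertFrom-Json $json).Stacks[0].Outputs) {
        $r[$i.OutputKey] = $i.OutputValue
    }
    $r
}
```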
Now I can get the database connection parameters with the command:
(get-cf-outputs microservice-rds-dev).DatabaseEndpoint
Passing JSON to aws cli commands also brings surprises. For example, an attempt to add a record to a DynamoDB table simply fails:
C:\Users\...> aws dynamodb put-item --table-name tableName --item '{"groupId":{"S":"5"}, "ancestors":{"L":[{"S":"5"},{"S":"0"}]}}'
Error parsing parameter '--item': Invalid JSON: Expecting property name enclosed in double quotes: line 1 column 2 (char 1)
That is, somewhere deep inside, aws cli interprets this string before passing it to the JSON library. (If this were bash I would understand, but why a Python program would need to do this is beyond my comprehension.)
To work around this odd behavior, it is enough to escape the special characters:
function put-item([string]$table, $data) {
    aws dynamodb put-item --table-name $table --item (ConvertTo-Json -Depth 128 $data).Replace('\', '\\').Replace('"', '\"')
}
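With this wrapper, the record that failed earlier can be written as nested PowerShell hashtables; ConvertTo-Json plus the escaping takes care of the quoting. A sketch, with tableName being the same placeholder as in the failing example:

```powershell
# The same record as in the failing example, now as nested hashtables;
# put-item serializes and escapes it before handing it to aws.
put-item tableName @{
    groupId   = @{ S = "5" }
    ancestors = @{ L = @( @{ S = "5" }, @{ S = "0" } ) }
}
```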
In principle, PowerShell also has a native AWS client, but it seemed to me more complicated and worse documented.
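For comparison, the public-IP lookup through that native client might look like this. A sketch, assuming the AWS Tools for PowerShell module (AWS.Tools.EC2) is installed and credentials are configured:

```powershell
# Requires: Install-Module AWS.Tools.EC2 (or the older AWSPowerShell module)
Import-Module AWS.Tools.EC2
# Get-EC2Instance returns Reservation objects that mirror the describe-instances JSON
(Get-EC2Instance -InstanceId i-ecf1fe5c).Instances[0].PublicIpAddress
```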