Daily Archives: February 20, 2021

Policy setup: allowing an AWS account to upload to S3

Assume Role Way

1. Create an AWS user. This user doesn't have any policy attached.

(screenshot: aws_user)

2. After the user account is created, AWS shows you the access key ID and secret access key. Copy them somewhere safe, since the secret is shown only once.

3. Create an IAM role. This role needs a policy that grants access to the S3 bucket.

(screenshot: aws_role)
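As a rough sketch, the role and its S3 policy could also be created from the CLI. The role name, bucket name, and `trust.json` file below are placeholders, not values from this post; `trust.json` is the trust policy described in the next step.

```shell
# Placeholder names: s3-upload-role, mybucket, trust.json.
# Create the role; trust.json is the trust policy from step 4.
aws iam create-role \
  --role-name s3-upload-role \
  --assume-role-policy-document file://trust.json

# Attach an inline policy granting access to the bucket.
aws iam put-role-policy \
  --role-name s3-upload-role \
  --policy-name s3-access \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject"],
      "Resource": ["arn:aws:s3:::mybucket", "arn:aws:s3:::mybucket/*"]
    }]
  }'
```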

4. This role should also have a trust relationship with the user account we've just created.

(screenshot: iam_role_trust)
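The trust relationship is a JSON policy document on the role. A minimal sketch, in which the account ID 123456789012 and the user name myuser are placeholders:

```shell
# Sketch of a trust policy allowing one IAM user to assume the role.
# 123456789012 and myuser are placeholders.
cat > trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "AWS": "arn:aws:iam::123456789012:user/myuser" },
    "Action": "sts:AssumeRole"
  }]
}
EOF
```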

5. Locally, run the command below.

AWS_ACCESS_KEY_ID=xxxx AWS_SECRET_ACCESS_KEY=xxxx aws sts assume-role --role-arn ${assume_role_arn} --role-session-name "RoleSession1"

It will then output the assumed role's access key, secret key, and session token. For this to work, the role's trust policy (step 4) must allow this user to assume the role.
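Instead of copying the values by hand, the output can be parsed and exported. A sketch, assuming python3 is available and the CLI returns its standard JSON shape; the hard-coded CREDS string below stands in for the real assume-role call, which is left in comments:

```shell
# Simulated output of: aws sts assume-role ... --query Credentials --output json
# In real use, replace this with the commented command below.
CREDS='{"AccessKeyId":"AKIAEXAMPLE","SecretAccessKey":"secret","SessionToken":"token"}'
# CREDS=$(AWS_ACCESS_KEY_ID=xxxx AWS_SECRET_ACCESS_KEY=xxxx \
#   aws sts assume-role --role-arn "$ASSUME_ROLE_ARN" \
#   --role-session-name RoleSession1 --query Credentials --output json)

# Small helper: pull one field out of the JSON.
jsonget() { echo "$CREDS" | python3 -c "import sys,json; print(json.load(sys.stdin)[\"$1\"])"; }

export AWS_ACCESS_KEY_ID=$(jsonget AccessKeyId)
export AWS_SECRET_ACCESS_KEY=$(jsonget SecretAccessKey)
export AWS_SESSION_TOKEN=$(jsonget SessionToken)
```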

6. Copy the key/secret/session token and run the commands below; they execute S3 operations.

AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX AWS_SESSION_TOKEN=XXX aws s3 ls s3://mybucket

AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX AWS_SESSION_TOKEN=XXX aws s3 cp /tmp/test.txt s3://mybucket

User Way

We can also create a user that directly has a policy granting access to the S3 bucket.

(screenshot: user)

Then we can directly run the below command to access the S3 bucket with the user's own credentials, instead of assuming a role. But this way is not recommended: long-lived user keys are easier to leak than the short-lived credentials you get from assume-role.
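The command here is presumably the same s3 call as above, just with the user's long-lived keys and no session token; a sketch:

```shell
# Sketch: use the user's own access key directly (no AWS_SESSION_TOKEN).
AWS_ACCESS_KEY_ID=XXX AWS_SECRET_ACCESS_KEY=XXX aws s3 ls s3://mybucket
```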


Category: aws

What is bad code? OO design, from Uncle Bob

Rigid code: modules are coupled. A change in module1 requires a change in module2, which in turn requires a change in module3.

Fragile code: a change in module1 causes an issue in another module or system that is seemingly unrelated to module1. A bizarre, weird break. Like when your car window can't be opened, the mechanic fixes the window, and then the car engine won't start.

Dependency problems: I want to reuse someone's code, and the code does solve the desired problem. But it also brings other problems along with it: weird data structures, databases, etc.

OO: encapsulation, inheritance, polymorphism

Uncle Bob argues OO actually weakened encapsulation, because members now need public/private/protected/default modifiers. In an older language such as C, callers saw only the function declarations in the header and just called the functions, never touching the data; that was good encapsulation.

https://www.youtube.com/watch?v=zHiWqnTWsn4&t=2642s

Bastion host configuration and private key in the ~/.ssh folder

We need to SSH to the bastion host and, from there, SSH to the xxx.ec2.internal host. The configuration in the ~/.ssh/config file looks like this:

# Applies to every *.ec2.internal host
Host *.ec2.internal
  # Default username for the final host: hadoop@xxx.ec2.internal
  User hadoop
  # The private SSH key
  IdentityFile ~/.ssh/ssh-private.key
  UseKeychain yes
  # Jump through the bastion host ("username" is the bastion login)
  ProxyCommand ssh username@xxx.bastion-host.com -W %h:%p

So later we can simply run "ssh abc.ec2.internal", and it will connect through the bastion host.

To configure only the private key, for all hosts:

Host *
  IdentityFile ~/.ssh/ssh-private.key
  UseKeychain yes

The equivalent one-line command, without any config file, is:

ssh -o ProxyCommand='ssh -W %h:%p {bastion-user-name}@{bastion-host-name}' username@{target-host-ip}

.ssh/config basic

A one-line command like:

ssh john@dev.example.com -p 2322

is equivalent to this ~/.ssh/config entry:

Host dev
    HostName dev.example.com
    User john
    Port 2322

Then: ssh dev

The options below make ssh remember the key's passphrase (in the macOS keychain and ssh-agent), so you won't need to type it again.

UseKeychain yes
AddKeysToAgent yes

For the matching order of Host patterns, see https://linuxize.com/post/using-the-ssh-config-file/

Tunneling

ssh hadoop@xxxx.ec2.internal -4 -ND 8157

-4: use IPv4 only
-N: do not execute a remote command (tunnel only)
-D 8157: open a dynamic (SOCKS) proxy on local port 8157 for tunneling
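Once the tunnel is up, traffic can be routed through the local SOCKS proxy. For example with curl (the URL below is a placeholder; a common use is reaching a web UI on an internal host):

```shell
# Send a request through the SOCKS proxy opened by ssh -D 8157.
# --socks5-hostname also resolves DNS through the proxy, which internal
# *.ec2.internal names require.
curl --socks5-hostname localhost:8157 http://xxx.ec2.internal:8088/
```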

Category: web