Issue
I am heavily struggling with Terraform's remote-exec and file provisioners. I am deploying an EC2 instance from a custom-built AMI based on Amazon Linux 2.
I have this code:
locals {
  ssh_user_home = "/home/ec2-user"
}

resource "null_resource" "rerun" {
  triggers = {
    rerun = uuid()
  }

  provisioner "file" {
    source      = "${path.module}/sources"
    destination = "${local.ssh_user_home}/tmp"

    connection {
      type        = "ssh"
      user        = "${var.aws_ssh_user}"
      private_key = "${data.aws_secretsmanager_secret_version.kibana_proxy_ssh_value.secret_string}"
      host        = "${aws_instance.logstash.private_ip}"
    }
  }

  provisioner "file" {
    source      = "./creds"
    destination = "${local.ssh_user_home}/tmp"

    connection {
      type        = "ssh"
      user        = "${var.aws_ssh_user}"
      private_key = "${data.aws_secretsmanager_secret_version.kibana_proxy_ssh_value.secret_string}"
      host        = "${aws_instance.logstash.private_ip}"
    }
  }

  provisioner "remote-exec" {
    inline = [
      "cd ${local.ssh_user_home}/tmp",
      "cp creds/.htpasswd.${var.aws_env} creds/.htpasswd",
      "bash -x sources/ansible.sh ${var.es_fqdn} ${var.kibana_domain}",
      # "rm -r /tmp/creds/",
      # "rm -r /tmp/sources/",
    ]

    connection {
      type        = "ssh"
      user        = "${var.aws_ssh_user}"
      private_key = "${data.aws_secretsmanager_secret_version.kibana_proxy_ssh_value.secret_string}"
      host        = "${aws_instance.logstash.private_ip}"
      # script_path = "${local.ssh_user_home}"
    }
  }
}
It always fails with this error:
module.logstash-instance.null_resource.rerun (remote-exec): Connecting to remote host via SSH...
module.logstash-instance.null_resource.rerun (remote-exec): Host: 10.135.202.29
module.logstash-instance.null_resource.rerun (remote-exec): User: ec2-user
module.logstash-instance.null_resource.rerun (remote-exec): Password: false
module.logstash-instance.null_resource.rerun (remote-exec): Private key: true
module.logstash-instance.null_resource.rerun (remote-exec): Certificate: false
module.logstash-instance.null_resource.rerun (remote-exec): SSH Agent: false
module.logstash-instance.null_resource.rerun (remote-exec): Checking Host Key: false
module.logstash-instance.null_resource.rerun (remote-exec): Connected!
Failed to upload script: scp: /tmp: Permission denied
I have no idea what I am doing wrong, since Terraform connects as ec2-user and everything is supposed to be copied into /home/ec2-user/tmp, not /tmp.
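For context on why /tmp shows up in the error at all: the remote-exec provisioner first uploads its inline commands as a shell script to the remote host and then executes it, and by default that script lands at /tmp/terraform_%RAND%.sh, independent of where the file provisioners copy their payload. The connection block's script_path argument can redirect that upload, but it expects a file path template containing %RAND%, not a bare directory like the commented-out line above. A minimal sketch of that variant, reusing the connection settings from the question:

connection {
  type        = "ssh"
  user        = "${var.aws_ssh_user}"
  private_key = "${data.aws_secretsmanager_secret_version.kibana_proxy_ssh_value.secret_string}"
  host        = "${aws_instance.logstash.private_ip}"

  # redirect the remote-exec script upload away from /tmp;
  # Terraform substitutes %RAND% with a random number
  script_path = "${local.ssh_user_home}/terraform_%RAND%.sh"
}

Whether this would have gotten past the hardened AMI is untested here; it only addresses the script upload path, not the file provisioners.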
Solution
I found the issue. Amazon Linux 2 is hardened in a way that does not allow this kind of SSH-based provisioning from an external source out of the box. I did not find a way to soften this restriction, so I moved the provisioning into user_data instead (cloud-init runs the user_data script when the instance first boots), and everything works now :)
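For anyone who wants a concrete starting point, here is a minimal sketch of the user_data approach. It assumes the sources and creds directories are already on the machine (for example baked into the custom AMI), since the file provisioners are gone; the var.ami_id and var.subnet_id names and the instance type are hypothetical placeholders, while aws_env, es_fqdn and kibana_domain are carried over from the question:

resource "aws_instance" "logstash" {
  ami           = var.ami_id      # hypothetical: the custom Amazon Linux 2 AMI
  instance_type = "t3.medium"     # hypothetical
  subnet_id     = var.subnet_id   # hypothetical

  # cloud-init executes this script on the instance's first boot,
  # so Terraform never needs an SSH connection into the instance
  user_data = <<-EOF
    #!/bin/bash
    set -euxo pipefail
    cd /home/ec2-user/tmp
    cp creds/.htpasswd.${var.aws_env} creds/.htpasswd
    bash -x sources/ansible.sh ${var.es_fqdn} ${var.kibana_domain}
  EOF
}

Note that ${...} inside the heredoc is interpolated by Terraform before the script is handed to cloud-init, so the environment, FQDN and domain values are baked into the rendered user_data.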
Answered By - WorkoutBuddy