ajoo

Deploy a database to the AWS server

Recommended Posts

Hi all !

 

I wonder if there is some way to deploy a database to an AWS EC2 instance. Nowhere have I found any reference, tutorial, or post about automated deployment of a database on AWS. I am sure it can be accomplished using a combination of the yml file and a bash script, but I have next to no experience with either.

 

Any help is sincerely appreciated !

 

Thanks all !

 


Use an AfterInstall hook (or another appropriate one) to run a shell script that feeds a SQL file (containing the commands needed to create the database) to the database.

```yaml
hooks:
  AfterInstall:
    - location: path/to/database-installer.sh
```

```sh
#!/bin/sh
# database-installer.sh
mysql < path/to/installer.sql
```

```sql
/* installer.sql */
CREATE DATABASE foo;
USE foo;

CREATE TABLE bar...
```
You may even be able to put the command right in the appspec file itself - it depends how that mechanism works, but it's worth a try.

```yaml
- location: mysql < path/to/installer.sql
```


Hi requinix, 

 

Ya, that was more or less what I had in mind. The database dump, i.e. the database.sql file, can be placed inside the project and transferred to a given folder outside the root using the instructions in the appspec file, and then the bash script can simply feed that to the database as you showed.

 

The one issue that I see is providing the database password. How can that be done, and securely at that? Sending the password along with the files would not be a good idea at all, I think.
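One common way to keep the password out of the deploy scripts is a MySQL option file owned by the deploy user. This is only a sketch (the user and password values are placeholders): the mysql client reads ~/.my.cnf automatically, so the hook script never needs the password on its command line.

```shell
# Sketch: write DB credentials to an option file readable only by this user,
# so deploy scripts can run `mysql < installer.sql` with no password argument.
# The user/password values below are placeholders.
cat > "$HOME/.my.cnf" <<'EOF'
[client]
user=root
password=replace-me
EOF
chmod 600 "$HOME/.my.cnf"   # keep other users on the instance from reading it
```

The file still has to get onto the instance somehow (baked into the image, or fetched from a private location at provision time), so this narrows the exposure rather than eliminating it.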

 

Hmm, why would I need something like the installer.sql script, though? Once the database dump is restored ("undumped") in MySQL, that's it! We are ready to use it. Maybe I am missing something there.

 

 

 

You may even be able to put the command right in the appspec file itself - it depends how that mechanism works, but it's worth a try.

 

Ya that's correct, like I mentioned above.

 

So basically it finally comes down to the username and password, and providing those safely to the restore script through bash again.

What do you think and suggest?

 

Thanks loads !

Edited by ajoo


If you're running MySQL 5.7 (maybe earlier?) then you should be able to connect as root without a password by default. And I think the CodeDeploy stuff runs as root, too.


Hi, Thanks for the reply !

 

No, I don't think that will work; the database always asks for the password. The database password was set by me, so it's bound to ask for it. The MySQL that is installed is 5.7.x as well.

 

Thanks

Edited by ajoo


root's password? How does it get set? And have you confirmed that you cannot get in with

```sh
$ sudo -i
# mysql
```

even if you set a password?

 

I'm thinking you can create a temporary user with full access and no password, run commands with that, then have it DROP USER itself when it's done.
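The temporary-user idea could look roughly like this in SQL (the user name is a placeholder; a user granted ALL PRIVILEGES on *.* has CREATE USER rights, which is what allows the final DROP):

```sql
-- Sketch only: create a short-lived installer account with no password,
-- run as an existing admin user during provisioning.
CREATE USER 'deploy_tmp'@'localhost';
GRANT ALL PRIVILEGES ON *.* TO 'deploy_tmp'@'localhost';

-- ...the deploy script connects as deploy_tmp and imports the dump...

-- ...and the last thing the script runs removes the account again:
DROP USER 'deploy_tmp'@'localhost';
```

The caveat stands: while the account exists, anything else on the instance can connect as it too.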


 

 

```sh
$ sudo -i
# mysql
```

Is that some sort of a bash script or just Linux commands? Is the # a mistype for $, the prompt?

 

I'll think this over some more and actually try it out. I'll see how far I can go and then revert.

 

Thanks loads for the ideas !


Is that some sort of a bash script or just Linux commands? Is the # a mistype for $, the prompt?

 

I'll think this over some more and actually try it out. I'll see how far I can go and then revert.

 

Thanks loads for the ideas !

 

From my experience:

 

$ is the user prompt

# is the root prompt

Edited by jbrown1028


Hey ! Thanks !

 

Just my very limited knowledge of Linux! :)


Here's the result of the commands suggested by requinix that I tried out:

 

 

```sh
vagrant@vagrant:~$ sudo -i

root@vagrant:~# mysql
ERROR 1045 (28000): Access denied for user 'root'@'localhost' (using password: NO)

root@vagrant:~# mysql -u root -p
Enter password:

Welcome to the MySQL monitor.  Commands end with ; or \g.
Your MySQL connection id is 69
Server version: 5.5.55-0ubuntu0.14.04.1 (Ubuntu)
```
 

 

Thanks


Ah, see you're running MySQL 5.5. That's before they changed how root access works.

 

I just remembered that you're using CodeDeploy on an existing instance. That means you can do whatever you want to the rest of the instance ahead of time.

 

Thing is, to be able to create databases and tables and whatnot there has to be a user capable of doing that. Or, there has to be a way to create such a user automatically when it's needed (then drop it when it isn't). That necessarily implies a security risk - someone on the instance will be able to do the same thing and get the same level of access.

 

Help me help you: What are you using CodeDeploy on an existing instance for? Why isn't this creating new instances? What all is the deployment process going to be doing?


Hi Requinix !

 

Thanks for the response !

 

 

Help me help you: What are you using CodeDeploy on an existing instance for? Why isn't this creating new instances? What all is the deployment process going to be doing? 

 

I am not deploying it on an existing instance. The deployment process creates a new instance, destroying the old one, for each deployment. Very smooth if there is no DB involved. So every time a new deployment takes place, if there is a new database, I want it to be deployed as well, which is what is not happening as of now. The AWS documentation does not say anything about it so far as I can tell. Their documentation is great, but so vast that it is inhibiting! Maybe you can help me find the apposite document, like you did earlier (thanks), in that mountain of information!

 

Please note that for deployment I am not using AWS directly, but going through Bitbucket. So for the deployment I don't even have to be logged into AWS. I hard-coded the credentials in AWS and, I guess, Bitbucket, so that they authenticate each other.

 

So once the instance is created, its yml should carry the database dump, and a bash script is needed to then "undump" the SQL, to coin a term if one doesn't already exist. There should also be a trigger for the bash script to execute on instance creation, and then, because the database in my application works with a password only (because I added one for security and also for forward compatibility), a way to securely provide that.

 

Or maybe it would be better, or plain simpler, to SSH into the instance once it is created and then run a bash script to update the database with the new .sql!

 

Those are my thoughts! Grateful for any advice or help!

 

Thanks !


Okay, that does make more sense, but I think that means some of what I said in the other thread isn't quite applicable.

 

There's a really obvious solution to this: use RDS. Put your database in RDS and none of this has to matter.


Hmm, how would that automate the changes to the database when a new instance is formed? Plus it would cost an extra instance, and the cost of using RDS may not be immediately required, or ever required, for smaller projects. Do you not think it could be accomplished using the yml and the bash script?


I think it could be (I gave a suggestion earlier) but I don't think it should be. I mean, you're destroying the database on every deploy...

 

You're already paying more for storing files in S3. If you wanted to save costs then a manual solution with a git fetch+checkout would be cheaper and easier. Really, you're getting into more sophisticated solutions here and you should seriously think about whether higher prices are worth the added benefits.



 

 

I think it could be (I gave a suggestion earlier) but I don't think it should be. I mean, you're destroying the database on every deploy...

 

Yes, I get your point. As the size of the DB increases, it would start to affect performance. But as of now, since it's still development and the DB is not even 2 MB, it doesn't matter. Once it's ready (enough), maybe it could just be ported and deployed to a VPS and handled manually. Right now this really eases development.

 

 

 

 If you wanted to save costs then a manual solution with a git fetch+checkout would be cheaper and easier.

 

Could you make a suggestion, please? What would the manual solution be? Do you mean just a VPS, like the EC2 instance I already have?

 

Thanks !


So you're deploying new instances for development? Why? Why not just update the files on a single instance?

 

If this is development then exposing a password shouldn't be a problem, right? It's just a dev machine.


 

 

So you're deploying new instances for development? Why? Why not just update the files on a single instance?

 

OK yes, like I mentioned earlier, that is how the CodeDeploy deployment functions. It first launches the new update in a new instance and then deletes the old one, allowing for no downtime at all. I am not doing it; AWS does it.

 

I am doing this since it is very cool and convenient: easy development directly on AWS, plus practice with services that I could use later. Oh, by the way, I have checked out RDS too. That keeps the database separate. So if there is a change in the DB, it gets updated on RDS, and the code changes deployed only affect the EC2 instance.

 

 

 

So you're deploying new instances for development? Why? Why not just update the files on a single instance?

 

Hmm, I am not too sure about that. You would probably know better. This is development on a live server, in a way, I suppose, so security should be a concern. Why would I want to expose a password even on a development machine? My instance is world-exposed. Do let me know if I have the wrong ideas.

 

Thanks !


This is development on a live server

Stop doing that. If you want to try doing this for "practice" in a way you can "use later" then do it right: set up a system for development that is completely and utterly separated from production. You don't have to use VPCs but at least put your development and production instances into different security groups that cannot talk to each other.

 

Don't half-ass this. Either you do it the real world way or you do it the easy way.

 

Why would I want to expose a password even on a development machine ?

Because it doesn't matter. It shouldn't matter. If it does matter then there is something wrong and you have to address that.

 

And we're not talking about exposing a password. We're talking about putting the file in a script that runs once on startup. You can always have the file delete itself when it's done.
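A self-deleting one-shot script can be as simple as this sketch (file names are made up, and the echo stands in for the real mysql import):

```shell
# Write a one-shot setup script, then run it; the script removes itself
# when done, so any credentials embedded in it do not linger on the instance.
cat > one_time_setup.sh <<'EOF'
#!/bin/sh
echo "one-time setup done" > setup.log   # stand-in for: mysql < installer.sql
rm -f -- "$0"                            # the script deletes itself when done
EOF
sh one_time_setup.sh
```

After it runs, setup.log exists but one_time_setup.sh is gone.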

 

My instance is world exposed.

Don't do that either. When you make the development security group I mentioned, block access to everyone but your own IP address.

 

Besides, you're using SSH keys so it's highly unlikely anyone will get in. And if they do, them getting the password to the dev database is the least of your concerns.


All the cool kids use Docker or Kubernetes these days. Might want to check them out as an alternative.

 

I do have to question the idea of an app server + DB monolithic stack. That is not the typical way that a scalable application is created.

 

Usually you have 1 -> n app servers per db, so you need an architecture to facilitate that. To start out with a plan that involves a DB on localhost per app server glosses over all the issues that your production architecture will have.



Hi !!
 

@requinix:

 

 

This is development on a live server

Sorry, what I was referring to was the server configuration in terms of Linux, Apache, PHP, etc. When I develop on XAMPP and then port to the AWS environment, it never really works on the first go, since there are numerous changes I sometimes have to make. The SQL normally breaks in a number of queries, etc. But when I am using an AWS instance to develop on, I don't have to make these numerous changes. Sorry I used the wrong terminology.

 

 

 

My instance is world exposed.

Again, I was referring to the EC2 instance, which has to be world-exposed. Not MySQL, which is only accessible from my local machine's IP using SSH!

 

@Gizmola:

 

 

All the cool kids use Docker or Kubernetes these days.

 

Thanks for the tip, I'll check it out. 

 

 

 

To start out with a plan that involves a DB on localhost per app server glosses over all the issues that your production architecture will have. 

 

Hmm, I intend to use the AWS Elastic Load Balancer eventually.

 

Thanks !


When I develop on XAMPP and then port to the AWS environment, it never really works on the first go, since there are numerous changes I sometimes have to make. The SQL normally breaks in a number of queries, etc.

That's an environmental thing (obviously), but not something you have to live with. Make sure you have about the same version of MySQL, the same version of PHP, and all the necessary packages and applications installed. Besides that, it's basically just using forward slashes for paths in your code, and everything should work.

 

But when I am using an AWS instance to develop on, I don't have to make these numerous changes.

Unless you do need the big deployment process, it might be easier to just set up Dropbox or BitTorrent Sync to keep files updated in both places. Then you write code locally, wait a second for it to sync, and your AWS instance has the updated code. Or manually have your IDE do an SSH/FTP upload when you want to push changes.


Hi !

 

 

 

it might be easier to just set up Dropbox or BitTorrent Sync to keep files updated in both places.

Thanks for this tip. I have no clue how this could be done, but I'll look it up. Maybe you can point me to a tutorial!

 

However, as of now I have managed to set up the project locally and on AWS, and have used the hooks successfully to deploy the database using the bash scripts. Even if I do not use it for continuous deployments, it serves as a great way to set up the code with the database initially, for the first time. Everything is ready in one go.

 

Thanks to all for all the inputs !


I never use XAMPP or MAMP or any of that stuff. Use Vagrant instead.

 

For your Amazon infrastructure this is what you should have:

 

- Start with a VPC, with subnets in it.

- Your app servers go into one subnet

- Data/databases go into another subnet

- Firewall rules allow only MySQL connections (port 3306) from the app subnet to the data subnet.

-- You might want caching servers like Redis in the data subnet, or you can have a separate subnet for those. In general, the more isolation the better.
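For that firewall rule, the AWS CLI call would be something along these lines (the security-group IDs are placeholders; this sketches the shape of the command rather than a tested invocation):

```sh
# Allow MySQL (3306) into the data-subnet group only from the app-subnet group.
aws ec2 authorize-security-group-ingress \
    --group-id sg-data0000 \
    --protocol tcp \
    --port 3306 \
    --source-group sg-app0000
```

Referencing a source security group instead of a CIDR means the rule follows the app instances wherever they launch.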

 

When you configure your ELB it will only allow HTTP/HTTPS traffic to servers in the app subnet.

 

Essentially your app server subnet is your DMZ, and all other servers (DB's, caching, queueing, etc) would be in private subnets.

 

Once you start looking at this, you'll see that there is no way the DB should be on the same server as the app server, assuming you really want to simulate a production environment. Vagrant is good for doing this, as is Docker. With Vagrant you can run separate machines attached to the same bridge network, or you can do a multi-machine Vagrant setup: https://www.vagrantup.com/docs/multi-machine/

 

With Docker you can write a docker-compose.yml and use docker-compose to orchestrate everything locally. Vagrant also offers Docker support now, so you can front-end Docker with Vagrant if you want.
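A minimal docker-compose.yml for the app-server-plus-separate-DB layout might look like this (the image tags and password are placeholders; in real use the password would come from an env file or a secret):

```yaml
version: "3"
services:
  app:
    image: php:7-apache        # stand-in for your real app image
    ports:
      - "80:80"
    depends_on:
      - db
  db:
    image: mysql:5.7
    environment:
      MYSQL_ROOT_PASSWORD: replace-me   # dev-only placeholder
```

Then `docker-compose up` brings up both containers on a shared network, with the database reachable from the app container under the hostname db.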

 

Regardless, there is no good reason to develop against a Windows environment when you can use virtualization to run an OS that matches your deployment environment. Personally I use Amazon Linux for all my AWS EC2s, and since that is closest to CentOS, I'd recommend CentOS images as your base, which will make any provisioning code transferable.


Hi Gizmola !

 

Thanks for the information.

 

I do have a few questions.

 

 

 

- Your app servers go into one subnet
- Data/databases go into another subnet

 

Does this imply that there need to be two server instances, one the application server and the other the database server? Or can the two subnets be created on the same single instance?

 

So essentially Docker and Vagrant are the same or similar technologies doing nearly the same job? Which do you recommend?

 

I am using Vagrant and also doing development on a VM. However, I am still more comfortable with XAMPP because I am very new to the Linux environment. But I have begun to use the VM more, because Vagrant allows me the comfort of developing in editors like Notepad++, which I really am comfortable with. I guess I'll wean off XAMPP soon, though.

 

Thanks again !

Edited by ajoo

