
Tangerine Server (Wizard) Setup Issue #596

atifrasheed79 opened this issue Aug 28, 2013 · 14 comments
@atifrasheed79

Hello there,

I am trying to set up the Tangerine Server (Wizard) on a local server and am facing an issue. Please note that I have installed the Tangerine app successfully, and it's working absolutely fine on my local server, so all the dependencies have been resolved. Now I just need a working WIZARD in my server app.

CouchDB is running, but when I checked out the development branch and executed the init.sh script in app/_attachments/js, I got the error below.


reading modules/navigation/NavigationView.js
reading router.js
reading boot.js
reading version.js
./uglify.rb:221:in `read': No such file or directory - /home/ubuntu/Tangerine-develop/app/_attachments/js/min/version.min.js (Errno::ENOENT)
        from ./uglify.rb:221:in `block in <main>'
        from ./uglify.rb:218:in `each'
        from ./uglify.rb:218:in `<main>'

So it seems some files like version.js and version.min.js are missing. I tried to fool the process by creating dummy files, but it didn't work :) Actually, it partially did: I was able to complete the init script, but when I pushed the couchapp (#Tangerine-devel/app>couchapp push) I got the URL below, and the Wizard app didn't show up properly at that URL. A screenshot is attached for reference.


ubuntu@ip-10-202-71-24:~/Tangerine/app$ couchapp push
2013-08-28 22:29:34 [INFO] Visit your CouchApp here:

http://10.202.71.24:5984/tangerine-develop/_design/ojai/index.html

[screenshot: tangerine-develop error]
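For context on the Errno::ENOENT above, here is a hedged illustration of what the failing loop in uglify.rb most likely does (assumed from the backtrace, not the actual source); it also shows why empty dummy files let the script finish:

    require "fileutils"

    # Hypothetical reconstruction of uglify.rb's failing loop: it appears to
    # File.read each expected min/<name>.min.js output. If minification never
    # produced one (version.min.js here), the read raises Errno::ENOENT.
    sources = ["router.js", "boot.js", "version.js"]
    FileUtils.mkdir_p("min")
    sources.each do |src|
      min_path = File.join("min", src.sub(/\.js\z/, ".min.js"))
      # The dummy-file workaround: an empty placeholder satisfies the read,
      # but of course ships no real minified code to the couchapp.
      File.write(min_path, "") unless File.exist?(min_path)
      puts "#{min_path}: #{File.read(min_path).bytesize} bytes"
    end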


adam704a commented Sep 3, 2013

Hello,

While we don't have the documentation available yet, you can actually build the Tangerine server using the Chef scripts found here and deploy it to an EC2 instance with the AWS provider. init.sh is being deprecated. We hope to have this new process documented this week.

@ghost ghost assigned adam704a Sep 3, 2013
@atifrasheed79

Thanks for the update, Adam. As I understand it, the Chef scripts create another virtual environment, so why would we want to create a VirtualBox VM within a Linux environment? Is it possible to have a simple setup that resides within our Unix environment?

Secondly, it would be a real issue if we want to deploy Tangerine on a cloud platform like EC2 or Google Compute Engine, as we can't create a virtual environment within a virtual environment.


adam704a commented Sep 3, 2013

What's doing the real work here isn't Chef, but rather Vagrant. Vagrant offers providers that can manage other virtual instances as well, including those on AWS. So the steps would be:

  1. Locally install the vagrant-aws plugin
  2. Add a few things to your Vagrantfile, including your AMI ID, key pair, and some security settings for AWS (see the sketch below)
  3. Call vagrant up --provider=aws

As you can see, this isn't a virtual environment within a virtual environment, but rather scripts that can be used to provision a VM running almost anywhere. This is what will be documented in more detail later this week.
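For reference, a minimal sketch of step 2, based on the vagrant-aws README; the AMI, instance type, region, and keypair below are copied from the logs later in this thread, while the security group and key path are placeholders, not Tangerine's actual configuration:

    # Hedged Vagrantfile sketch for the AWS provider; values are assumptions.
    Vagrant.configure("2") do |config|
      config.vm.box = "dummy"   # vagrant-aws boots from the AMI, so a placeholder box suffices

      config.vm.provider :aws do |aws, override|
        aws.access_key_id     = ENV["AWS_ACCESS_KEY_ID"]
        aws.secret_access_key = ENV["AWS_SECRET_ACCESS_KEY"]
        aws.ami               = "ami-7539b41c"        # from the log below
        aws.instance_type     = "m1.small"
        aws.region            = "us-east-1"
        aws.keypair_name      = "M1-Large"            # from the log below
        aws.security_groups   = ["tangerine"]         # placeholder; must allow SSH (port 22)

        override.ssh.username         = "ubuntu"
        override.ssh.private_key_path = "~/.ssh/M1-Large.pem"  # placeholder path
      end
    end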

@atifrasheed79

Thanks for the update, Adam, but Chef also supports deployment on physical servers! Will that be supported in the upcoming installation docs?

I have followed the documentation and help you have provided so far and have tried to set up my environment, but it's not going through. I created a Vagrantfile and did vagrant up, but it got stuck on "waiting for SSH to become available". I can see that an instance has been initiated in my AWS space, and I can even log in to it using the same credentials I supplied in my Vagrantfile.

I have provided a couple of screenshots for your review. Instance ID i-a5cc*** was initiated by Vagrant and is successfully up. I will also retry after enabling Vagrant DEBUG logging to see what's going wrong.

[screenshot 1]
[screenshot 2]

@atifrasheed79

Figured it out: somehow SSH auth was not working, and it succeeded on the 4th or 5th attempt. Now I can see that my Vagrant (ict-chef-repo) directory has been rsynced to the new AWS instance initiated by Vagrant.

Now what do I do? How do I bring the new environment up and running?

@atifrasheed79

I have tried it again with DEBUG enabled, and the debug log is attached for your reference.

ubuntu@ip-10-202-71-24:/mnt/tangerine/ict-chef-repo$ VAGRANT_LOG=DEBUG vagrant up --provider=aws

Please note that the debug file is actually text, but this forum only allows png, gif, or jpg attachments, so please rename it to .txt before you try to open it.
[attachment: vagrantup-debug]

@atifrasheed79

Hi Adam, I have tried the updated code, but got the same results. After "vagrant up --provider=aws", I can log in to the machine using "vagrant ssh", but I don't see any service running, and there is nothing in /var/www either.

Somehow the provisioner is not running in my case. I just get this warning after vagrant up finishes, even if I try vagrant provision. Please look into it.

ubuntu@ip-10-202-71-24:/mnt/tangerine/ict-chef-repo$ vagrant up --provider=aws
Bringing machine 'default' up with 'aws' provider...
[default] Warning! The AWS provider doesn't support any of the Vagrant
high-level network configurations (config.vm.network). They
will be silently ignored.
[default] Launching an instance with the following settings...
[default] -- Type: m1.small
[default] -- AMI: ami-7539b41c
[default] -- Region: us-east-1
[default] -- Keypair: M1-Large
[default] -- Block Device Mapping: []
[default] -- Terminate On Shutdown: false
[default] Waiting for instance to become "ready"...
[default] Waiting for SSH to become available...
[default] Machine is booted and ready for use!
[default] Rsyncing folder: /mnt/tangerine/ict-chef-repo/ => /vagrant
/opt/vagrant/embedded/gems/gems/vagrant-1.2.7/lib/vagrant/util/which.rb:32: warning: Insecure world writable dir /opt/vagrant/bin/.. in PATH, mode 040777
ubuntu@ip-10-202-71-24:/mnt/tangerine/ict-chef-repo$ vagrant status
Current machine states:

default running (aws)

The EC2 instance is running. To stop this machine, you can run
vagrant halt. To destroy the machine, you can run vagrant destroy.
ubuntu@ip-10-202-71-24:/mnt/tangerine/ict-chef-repo$
ubuntu@ip-10-202-71-24:/mnt/tangerine/ict-chef-repo$
ubuntu@ip-10-202-71-24:/mnt/tangerine/ict-chef-repo$ vagrant ssh
/opt/vagrant/embedded/gems/gems/vagrant-1.2.7/lib/vagrant/util/which.rb:32: warning: Insecure world writable dir /opt/vagrant/bin/.. in PATH, mode 040777
Welcome to Ubuntu 12.10 (GNU/Linux 3.5.0-21-generic x86_64)

0 packages can be updated.
0 updates are security updates.

New release '13.04' available.
Run 'do-release-upgrade' to upgrade to it.

Get cloud support with Ubuntu Advantage Cloud Guest
http://www.ubuntu.com/business/services/cloud
*** /dev/xvda1 will be checked for errors at next reboot ***

Last login: Tue Sep 10 18:54:18 2013 from 39.41.237.74
ubuntu@ip-10-169-11-217:~$ exit
logout

@adam704a

I updated the documentation to include a little blurb about configuring Vagrant. What was missing before was two things:

1. the section for actually provisioning the VM in the Vagrantfile (see the sketch below)
2. a plugin that installs Chef on the VM (assuming it's not installed already)

It's working for me now.
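A hedged sketch of those two pieces, assuming the vagrant-omnibus plugin and a chef-solo run; the role and paths are hypothetical, not necessarily what the ict-chef-repo actually uses:

    # Sketch only; cookbook/role names are assumptions.
    Vagrant.configure("2") do |config|
      # vagrant-omnibus ("vagrant plugin install vagrant-omnibus") installs
      # Chef on the VM before provisioning if it isn't already there.
      config.omnibus.chef_version = :latest

      # The previously missing provisioning section: run chef-solo against
      # the cookbooks and roles rsynced to /vagrant.
      config.vm.provision :chef_solo do |chef|
        chef.cookbooks_path = "cookbooks"
        chef.roles_path     = "roles"
        chef.add_role("tangerine")   # hypothetical role name
      end
    end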

@atifrasheed79

Hi, thanks for the update. I have tried it, and yes, it works: Chef is installed on the remote machine. But what next? How are the actual services going to be provisioned on the remote AWS machine?

I would appreciate it if you could explain the whole setup! I will give it a try, and if it works, I can write a detailed install doc to put up on the wiki page.

@adam704a

I think the piece that we are talking about now is the vagrant-aws plugin, which is documented in the README on their GitHub page. Basically, how this works is that when you run vagrant up --provider=aws, the entire Chef repository is copied to the remote VM in the /vagrant directory, and chef-client is then called, which provisions the server (you'll see all of this happening in your terminal window).

@atifrasheed79

Thanks for the details. Yes, it's working up to the installation of chef-client, but nothing happens afterwards. I can only see the chef version command running in the newly brought-up instance.

[screenshot attached]

@atifrasheed79

Hi Adam,

I have tried the standard (VirtualBox) method, and it all went well except that I am not able to access Tangerine :) You can see in the log below that it all went smoothly. I logged into the VM and can see the couchdb service running, but I don't see the Tangerine app deployed in /var/lib/couchdb. I still tried to access the default Tangerine URL, but it's not accessible. Somehow the Tangerine installation didn't happen, and I haven't seen any error on stdout.

root@ubuntu:/home/atif/ict-chef-repo# vagrant up
Bringing machine 'tangerine' up with 'virtualbox' provider...
[tangerine] Clearing any previously set forwarded ports...
[tangerine] Creating shared folders metadata...
[tangerine] Clearing any previously set network interfaces...
[tangerine] Preparing network interfaces based on configuration...
[tangerine] Forwarding ports...
[tangerine] -- 22 => 2222 (adapter 1)
[tangerine] -- 8080 => 8888 (adapter 1)
[tangerine] -- 3306 => 3333 (adapter 1)
[tangerine] Booting VM...
[tangerine] Waiting for machine to boot. This may take a few minutes...
[tangerine] Machine booted and ready!
[tangerine] Mounting shared folders...
[tangerine] -- /vagrant
[tangerine] -- /tmp/vagrant-chef-1/chef-solo-2/roles
[tangerine] -- /tmp/vagrant-chef-1/chef-solo-1/cookbooks
root@ubuntu:/home/atif/ict-chef-repo#

@atifrasheed79

Apologies for my insistence, but why can't I simply push the WIZARD couchapp to CouchDB instead of going through this complex Chef/Vagrant cycle?

I successfully deployed the Tangerine app on my CouchDB, and it worked like a charm; now I just need to do the same for the WIZARD app, which is used to create assessments.

@adam704a
Copy link

Thanks for your patience on this stuff. We are trying to iron out this process as much as we can. When you provision your Tangerine server using these scripts, CouchDB is installed and configured, running on the default port (5984). When running this on AWS, just make sure that your security group allows this configuration. Also, I just updated the Vagrantfile so that this port is forwarded properly when running on VirtualBox. Now you will be able to access the server like this: http://localhost:5984/_utils/. Please let me know if you have any questions.
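The forwarding change Adam mentions would look something like this in the Vagrantfile (a sketch, not the actual commit):

    Vagrant.configure("2") do |config|
      # Forward CouchDB's default port from the VirtualBox guest to the host
      # so that http://localhost:5984/_utils/ reaches the VM's CouchDB.
      config.vm.network :forwarded_port, guest: 5984, host: 5984
    end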
