mongoid group by query


stages = [
  { "$match" => { column_name: "Value" } },
  { "$group" => {
      "_id" => {
        "name_of_your_choice" => "$column_name",
        "year"  => { "$year" => "$column_name" },
        "month" => { "$month" => "$column_name" },
        "day"   => { "$dayOfMonth" => "$column_name" },
        "hour"  => { "$hour" => "$column_name" }
      },
      "get_avg_of_grouped_records" => { "$avg" => "$column_name" },
      "count" => { "$sum" => 1 }
    }
  }
]

@array_of_objects = ModelName.collection.aggregate(stages, {:allow_disk_use => true})

Query stages
The $match stage applies conditions before data is fetched by the later stages, so place it before the other stages.
The $group stage groups the collection's records; "_id" holds the fields you want to group on.
Error: A pipeline stage specification object must contain exactly one field. (16435)
The stages variable is an array, and each stage must be its own hash inside it. In the query above, $match and $group are two stages, each placed in a separate hash as a separate element of the stages array.
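The one-stage-per-hash rule can be sketched in plain Ruby. Field names here ("status", "created_at", "price") are hypothetical examples, not from the post:

```ruby
# Build the pipeline: one hash per stage.
# "status", "created_at" and "price" are hypothetical field names.
match_stage = { "$match" => { "status" => "active" } }
group_stage = {
  "$group" => {
    "_id" => {
      "year"  => { "$year"  => "$created_at" },
      "month" => { "$month" => "$created_at" }
    },
    "avg_price" => { "$avg" => "$price" },
    "count"     => { "$sum" => 1 }
  }
}
stages = [match_stage, group_stage]

# Each element of stages must contain exactly one stage operator,
# otherwise Mongo raises error 16435.
stages.each { |stage| raise "invalid stage" unless stage.keys.length == 1 }

# With Mongoid (requires a live MongoDB), the pipeline would run as:
# results = ModelName.collection.aggregate(stages, allow_disk_use: true)
# results.each { |doc| puts "#{doc['_id']} avg=#{doc['avg_price']} n=#{doc['count']}" }
```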

Posted in mongoid

Weka create instance and training data using arff

Read training data from arff file:


public static Instances get_instances_from_arff() throws Exception {
	BufferedReader breader = new BufferedReader(
			new FileReader(System.getProperty("user.dir") + "/src/weka_usage/exploration_tracks.arff"));
	Instances training_data = new Instances(breader);
	training_data.setClassIndex(training_data.numAttributes() - 1);
	breader.close();
	return training_data;
}

Create a new instance from the provided attribute values:



public static Instance create_instance(double[] attr, Instances training_data) {
	// Create the instance
	DenseInstance inst = new DenseInstance(4);
	inst.setValue(0, attr[0]); // web
	inst.setValue(1, attr[1]); // db
	inst.setValue(2, attr[2]); // arrival rate
	inst.setValue(3, attr[3]); // response time

	inst.setDataset(training_data); // associate training data with the instance to help in its classification
	return inst;
}

build_classifier method (note: String values must be compared with .equals(), not ==):


public static AbstractClassifier build_classifier(String type, Instances data) throws Exception {

	if (type.equals("RandomForest")) {
		RandomForest rF = new RandomForest();
		rF.buildClassifier(data);
		return rF;
	}
	else if (type.equals("MultilayerPerceptron")) {
		MultilayerPerceptron rF = new MultilayerPerceptron();
		rF.buildClassifier(data);
		return rF;
	}
	else if (type.equals("LinearRegression")) {
		LinearRegression rF = new LinearRegression();
		rF.buildClassifier(data);
		return rF;
	}
	else {
		// GaussianProcesses is the default
		GaussianProcesses rF = new GaussianProcesses();
		rF.buildClassifier(data);
		return rF;
	}
}
Posted in Weka

Attach EBS volume as root to EC2 instance Amazon

Amazon EBS Device Naming Conventions

Attach the volume to the existing instance by following these steps:

  1. Stop your instance.
  2. Create a snapshot of the root volume.
  3. Create a new volume from the snapshot. [In case of Ubuntu, name it /dev/sda1]
  4. Detach the old Amazon EBS root volume from the already-stopped instance, by right-clicking the old EBS volume.
  5. Attach the new Amazon EBS volume to the instance.
  6. Start your instance.

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-attaching-volume.html
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ebs-using-volumes.html
http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/ec2-add-volume-to-instance.html

Posted in amazon, EBS (Elastic Block Storage)

Mongo::Error::NoServerAvailable OR mongod stopped working with Rails OR Insufficient free space for journals, terminating

First I tried to remove the mongod lock, but it did not work:


sudo rm /var/lib/mongodb/mongod.lock
sudo service mongodb restart

Then I tried to change permissions on the “/tmp” folder:


ls -lh /tmp
chown root:root /tmp
chmod 1777 /tmp
sudo service mongodb restart
tail -f /var/log/mongodb/mongod.log

It did not work either, but it pointed me towards the problem.

Problem:


Insufficient free space for journal file
Please make at least 3379MB available in /var/lib/mongodb/journal or use --smallfiles
Insufficient free space for journals, terminating
now exiting  # mongod is stopping itself
shutdown: going to close listening sockets...
removing socket file: /tmp/mongodb-27017.sock

Solution:

sudo nano /etc/mongod.conf

and add


storage:
   mmapv1:
      smallFiles: true

Now tell mongod to “Use the Configuration File” with the command below:

mongod -f /etc/mongod.conf

In another terminal tab, open the log file:

tail -f /var/log/mongodb/mongod.log

Restart mongod

sudo service mongod restart

The log file will contain “connection now open” if everything is fine.

Side Notes:
1. To check the installed mongod version:

mongod --version

2. Other mongod.conf options:


# Where and how to store data.
storage:
  dbPath: /var/lib/mongodb
  journal:
    enabled: true
#  engine:
  mmapv1:
    smallFiles: true  # OPTION ADDED FOR SMALL FILES
#  wiredTiger:

# where to write logging data.
systemLog:
  destination: file
  logAppend: true
  path: /var/log/mongodb/mongod.log

# network interfaces
net:
  port: 27017
  bindIp: 127.0.0.1

http://stackoverflow.com/a/8479630/1222852
https://docs.mongodb.org/manual/reference/configuration-options/
MongoDB small-files option settings for different versions

Posted in mongodb, mongoid

httperf sending requests


httperf --hog --server localhost --port 80 --wsesslog 1,1,req.txt --rate 1 --num-con 1 --num-call 1 --timeout 5 --add-header="Content-Type: application/x-www-form-urlencoded\n" --print-reply

To get only the header or the body of the reply, use:
--print-reply=header OR --print-reply=body

req.txt file content


/PHP/ think=2.0
/PHP/register.html

/PHP/sell.html
/PHP/BrowseCategories.php method=POST contents="nickname=root&password=root"

/PHP/XYZ.php method=POST contents="nickname=root&password=root"
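For longer tests, a session file like the one above can be generated instead of hand-written. A minimal sketch (the helper and URIs are just illustrations of the wsesslog format, where blank lines separate sessions):

```ruby
# Hypothetical generator for an httperf --wsesslog session file.
# Each line is a URI, optionally followed by options such as
# method=POST or contents=...; a blank line separates sessions.
sessions = [
  ["/PHP/ think=2.0", "/PHP/register.html"],
  ["/PHP/sell.html",
   '/PHP/BrowseCategories.php method=POST contents="nickname=root&password=root"']
]
content = sessions.map { |lines| lines.join("\n") }.join("\n\n") + "\n"
File.write("req.txt", content)
```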

http://www.hpl.hp.com/research/linux/httperf/wisp98/html/doc003.html
http://www.mervine.net/performance-testing-with-httperf
http://www.hpl.hp.com/research/linux/httperf/httperf-man-0.9.txt
http://jairtrejo.mx/blog/2014/04/performance-testing-with-httperf
https://gist.github.com/FZambia/5599483

Posted in httperf, performance testing

Rails production environment assets loading

In Gemfile add:

gem 'rails_12factor'

In config/environments/production.rb add


config.assets.compile = false # stop runtime asset precompilation in production
config.assets.digest = true   # serve precompiled assets whose file names have a digest appended by Rails
config.cache_classes = true   # cache application classes between requests

In config/application.rb

replace this

Bundler.require(:default, Rails.env)

with this

Bundler.require(:default, :assets, Rails.env)

Also add these lines


# Enable the asset pipeline
config.assets.enabled = true
# Version of your assets, change this if you want to expire all your assets
config.assets.version = '1.0'
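To see why `config.assets.digest = true` matters: Sprockets fingerprints each compiled asset by appending a content digest to its file name, so a changed file always gets a new URL and stale browser caches are bypassed. A rough sketch of the naming scheme (not the actual Sprockets implementation):

```ruby
require 'digest/md5'

# Rough sketch of how a digested asset name is derived:
# the fingerprint changes whenever the content changes.
def digested_name(logical_name, content)
  digest = Digest::MD5.hexdigest(content)
  ext  = File.extname(logical_name)
  base = logical_name.chomp(ext)
  "#{base}-#{digest}#{ext}"
end

digested_name("application.css", "body { color: red; }")
# produces something like "application-<32 hex chars>.css"
```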
Posted in production, Ruby on Rails

Downloading Objects from Amazon S3 using the AWS SDK [API V2] for Ruby

Set the variables below in your project, as environment variables, or in whatever way you want.


AWS_ACCESS_KEY_ID = 'S3 bucket access key id'
AWS_SECRET_ACCESS_KEY = 'S3 bucket secret access key'
AWS_REGION = 's3 bucket region'
AWS_BUCKET = 'bucket name'

Follow either of the two ways given below to download objects to your local system.


s3 = Aws::S3::Client.new
s3.list_objects(bucket: 'AWS_BUCKET NAME HERE').each do |response|
  response.contents.each do |obj|
    File.open("#{Rails.root}/#{obj.key}", 'wb') do |file|
      s3.get_object( bucket: 'AWS_BUCKET NAME HERE', key: obj.key , response_target: file)
    end
  end
end

s3 = Aws::S3::Client.new
bucket = Aws::S3::Bucket.new('AWS_BUCKET NAME HERE')
bucket.objects.each do |obj|
  File.open("#{Rails.root}/#{obj.key}", 'wb') do |file|
    s3.get_object(bucket: 'AWS_BUCKET NAME HERE', key: obj.key, response_target: file)
  end
end
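Note that S3 keys can contain slashes, so writing to `"#{Rails.root}/#{obj.key}"` fails unless the intermediate directories already exist. A small helper (hypothetical name) can prepare the local path before each get_object call:

```ruby
require 'fileutils'

# Ensure the parent directories for an S3 key exist locally,
# returning the full local path to write the object to.
def local_path_for(root, key)
  path = File.join(root, key)
  FileUtils.mkdir_p(File.dirname(path))
  path
end

# Usage inside either loop above:
# File.open(local_path_for(Rails.root.to_s, obj.key), 'wb') do |file|
#   s3.get_object(bucket: 'AWS_BUCKET NAME HERE', key: obj.key, response_target: file)
# end
```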

get_object instance method (V2 documentation)
Official AWS SDK gem for Ruby

Posted in amazon, S3

rbenv install 2.2.1 ruby not working

After running this command:

rbenv install 2.2.1

I faced the error below:

Installing ruby-2.2.1…

BUILD FAILED (Ubuntu 14.04 using ruby-build 20150928-2-g717a54c)

Inspect or clean up the working tree at /tmp/ruby-buil

Install the ‘libffi-dev’ package using the command below; after that, the above command will work:

sudo apt-get install libffi-dev

Posted in missing libraries, Ruby on Rails, Ubuntu

User does not respond to ‘devise’ method

In config/initializers/devise.rb

Replace this:

require 'devise/orm/active_record'

with this:

require 'devise/orm/mongoid'

After the change, the block should look like this:


Devise.setup do |config|
...
require 'devise/orm/mongoid'
...
end
Posted in Devise, mongoid

LaTeX bibliography not working [Make it work]

If you are using the command line:


pdflatex filename.tex
bibtex filename   # note: bibtex takes the file name without the .tex extension
pdflatex filename.tex
pdflatex filename.tex

OR

If you are using Texmaker:

Open your tex file and run these steps using its GUI:


Select => PDFLaTeX and compile
Select => BibTeX and compile
Select => PDFLaTeX and compile
Select => PDFLaTeX and compile
Posted in Latex