Showing posts with label k. Show all posts

Thursday, September 21, 2017

UK game industry already feeling Brexit pinch



New survey shows concerns in world's sixth-largest games market

Continue reading…



from Polygon - All
http://bit.ly/1MhhK43


Read more »

Saturday, September 16, 2017

U.N.K.L.E. Discography Torrent by Boumil



Name: U.N.K.L.E. Discography Torrent
File size: 21 MB
Date added: November 19, 2013
Price: Free
Operating system: Windows XP/Vista/7/8
Total downloads: 1092
Downloads last week: 68
Product ranking: ★★★★★


Read more »

Monday, September 4, 2017

Trident-ML Clustering using K-Means



This post shows a very basic example of how to use the k-means clustering algorithm in Trident-ML to process data from a Storm spout.

First, create a Maven project (e.g. with groupId="com.memeanalytics", artifactId="trident-k-means"). The complete source code for the project can be downloaded from:

https://dl.dropboxusercontent.com/u/113201788/storm/trident-k-means.tar.gz

To start, we need to configure the project's pom.xml file.

Configure pom.xml:
First, add the clojars repository to the repositories section:

<repositories>
<repository>
<id>clojars</id>
<url>http://clojars.org/repo</url>
</repository>
</repositories>

Next, add the storm dependency to the dependencies section:

<dependency>
<groupId>storm</groupId>
<artifactId>storm</artifactId>
<version>0.9.0.1</version>
<scope>provided</scope>
</dependency>

Next, add the trident-ml dependency to the dependencies section (for k-means clustering):

<dependency>
<groupId>com.github.pmerienne</groupId>
<artifactId>trident-ml</artifactId>
<version>0.0.4</version>
</dependency>

Next, add the exec-maven-plugin to the build/plugins section (so the Maven project can be executed):

<plugin>
<groupId>org.codehaus.mojo</groupId>
<artifactId>exec-maven-plugin</artifactId>
<version>1.2.1</version>
<executions>
<execution>
<goals>
<goal>exec</goal>
</goals>
</execution>
</executions>
<configuration>
<includeProjectDependencies>true</includeProjectDependencies>
<includePluginDependencies>false</includePluginDependencies>
<executable>java</executable>
<classpathScope>compile</classpathScope>
<mainClass>com.memeanalytics.trident_k_means.App</mainClass>
</configuration>
</plugin>

Next, add the maven-assembly-plugin to the build/plugins section (for packaging the Maven project into a jar that can be submitted to a Storm cluster):

<plugin>
<artifactId>maven-assembly-plugin</artifactId>
<version>2.2.1</version>
<configuration>
<descriptorRefs>
<descriptorRef>jar-with-dependencies</descriptorRef>
</descriptorRefs>
<archive>
<manifest>
<mainClass></mainClass>
</manifest>
</archive>
</configuration>
<executions>
<execution>
<id>make-assembly</id>
<phase>package</phase>
<goals>
<goal>single</goal>
</goals>
</execution>
</executions>
</plugin>

Implement Spout for training data 

Once the pom.xml update is complete, we can implement RandomFeatureSpout, the Storm spout that emits batches of training data to the Trident topology:

package com.memeanalytics.trident_k_means;

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

import com.github.pmerienne.trident.ml.core.Instance;
import com.github.pmerienne.trident.ml.testing.data.Datasets;

import backtype.storm.task.TopologyContext;
import backtype.storm.tuple.Fields;
import storm.trident.operation.TridentCollector;
import storm.trident.spout.IBatchSpout;

public class RandomFeatureSpout implements IBatchSpout{

private int batchSize=10;
private int numFeatures=3;
private int numClasses=3;

public void open(Map conf, TopologyContext context) {
// no resources to set up for this spout

}

public void emitBatch(long batchId, TridentCollector collector) {
// emit a batch of randomly generated training instances
List<Instance<Integer>> data = Datasets.generateDataForMultiLabelClassification(batchSize, numFeatures, numClasses);


for(Instance<Integer> instance : data)
{
List<Object> values=new ArrayList<Object>();
values.add(instance.label);

for(double feature : instance.getFeatures())
{
values.add(feature);
}
collector.emit(values);
}

}

public void ack(long batchId) {
// batches are randomly generated, so nothing to do on ack

}

public void close() {
// nothing to clean up

}

public Map getComponentConfiguration() {
// use the default component configuration
return null;
}

public Fields getOutputFields() {
// one field for the label, three for the features
return new Fields("label", "x0", "x1", "x2");
}
}

As can be seen above, RandomFeatureSpout implements IBatchSpout and emits a batch of 10 tuples at a time. Each tuple is a training record containing the fields ("label", "x0", "x1", "x2"): the label is an integer, while x0, x1, and x2 are double values. The training records are obtained from Trident-ML's Datasets.generateDataForMultiLabelClassification() method.
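The tuple layout is easy to check outside Storm: a tuple's values are just the label followed by the features, matching the declared output fields. A minimal plain-Java sketch (the TupleSketch class and its method are mine for illustration; they are not part of Storm or Trident-ML):

```java
import java.util.ArrayList;
import java.util.List;

public class TupleSketch {
    // Builds the value list emitted for one training record: the integer
    // label first, then each feature, matching
    // new Fields("label", "x0", "x1", "x2").
    public static List<Object> toTupleValues(int label, double[] features) {
        List<Object> values = new ArrayList<Object>();
        values.add(label);
        for (double feature : features) {
            values.add(feature);
        }
        return values;
    }

    public static void main(String[] args) {
        // A record with label 2 and three features becomes a 4-element tuple.
        System.out.println(toTupleValues(2, new double[]{0.1, 0.5, -1.2}));
    }
}
```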

K-means in Trident topology using Trident-ML implementation

Once we have the training data spout, we can build a Trident topology that uses the training data to assign a cluster label to each data record using the k-means algorithm in Trident-ML. This is implemented in the main class shown below:

package com.memeanalytics.trident_k_means;

import java.util.ArrayList;
import java.util.List;
import java.util.Random;

import com.github.pmerienne.trident.ml.clustering.ClusterQuery;
import com.github.pmerienne.trident.ml.clustering.ClusterUpdater;
import com.github.pmerienne.trident.ml.clustering.KMeans;
import com.github.pmerienne.trident.ml.core.Instance;
import com.github.pmerienne.trident.ml.preprocessing.InstanceCreator;
import com.github.pmerienne.trident.ml.testing.data.Datasets;

import storm.trident.TridentState;
import storm.trident.TridentTopology;
import storm.trident.testing.MemoryMapState;
import backtype.storm.Config;
import backtype.storm.LocalCluster;
import backtype.storm.LocalDRPC;
import backtype.storm.generated.AlreadyAliveException;
import backtype.storm.generated.InvalidTopologyException;
import backtype.storm.generated.StormTopology;
import backtype.storm.tuple.Fields;

public class App
{
public static void main( String[] args ) throws AlreadyAliveException, InvalidTopologyException
{
LocalDRPC drpc=new LocalDRPC();

Config config=new Config();

LocalCluster cluster=new LocalCluster();

cluster.submitTopology("KMeansDemo", config, buildTopology(drpc));

try{
Thread.sleep(10000);
}catch(InterruptedException ex)
{
ex.printStackTrace();
}



for(int i=0; i < 10; ++i)
{
String drpc_args=generateRandomTestingArgs();
System.out.println(drpc.execute("predict", drpc_args));
try{
Thread.sleep(1000);
}catch(InterruptedException ex)
{
ex.printStackTrace();
}
}

cluster.killTopology("KMeansDemo");
cluster.shutdown();
drpc.shutdown();
}

private static String generateRandomTestingArgs()
{
int batchSize=10;
int numFeatures=3;
int numClasses=3;

final Random rand=new Random();

List<Instance<Integer>> data = Datasets.generateDataForMultiLabelClassification(batchSize, numFeatures, numClasses);

String args="";
Instance<Integer> instance = data.get(rand.nextInt(data.size()));


args+=instance.label;

for(double feature : instance.getFeatures())
{
args+=(","+feature);
}


return args;
}

private static StormTopology buildTopology(LocalDRPC drpc)
{
TridentTopology topology=new TridentTopology();

RandomFeatureSpout spout=new RandomFeatureSpout();

TridentState clusterModel = topology.newStream("training", spout).each(new Fields("label", "x0", "x1", "x2"), new InstanceCreator<Integer>(), new Fields("instance")).partitionPersist(new MemoryMapState.Factory(), new Fields("instance"), new ClusterUpdater("kmeans", new KMeans(3)));

topology.newDRPCStream("predict", drpc).each(new Fields("args"), new DRPCArgsToInstance(), new Fields("instance")).stateQuery(clusterModel, new Fields("instance"), new ClusterQuery("kmeans"), new Fields("predict"));

return topology.build();
}
}
DRPCArgsToInstance, used in the DRPC stream above, is implemented in its own class:

package com.memeanalytics.trident_k_means;


import backtype.storm.tuple.Values;

import com.github.pmerienne.trident.ml.core.Instance;

import storm.trident.operation.BaseFunction;
import storm.trident.operation.TridentCollector;
import storm.trident.tuple.TridentTuple;

public class DRPCArgsToInstance extends BaseFunction{

private static final long serialVersionUID = 1L;

public void execute(TridentTuple tuple, TridentCollector collector) {
// parse the comma-separated DRPC arguments: label first, then features
String drpc_args = tuple.getString(0);
String[] args = drpc_args.split(",");
Integer label=Integer.parseInt(args[0]);
double[] features=new double[args.length-1];
for(int i=1; i < args.length; ++i)
{
double feature=Double.parseDouble(args[i]);
features[i-1] = feature;
}
Instance<Integer> instance=new Instance<Integer>(label, features);

collector.emit(new Values(instance));
}

}

As can be seen above, the Trident topology has an InstanceCreator<Integer> Trident operation which converts each raw ("label", "x0", "x1", "x2") tuple into an Instance<Integer> object that can be consumed by ClusterUpdater. The ClusterUpdater object from Trident-ML updates the underlying cluster model via the k-means algorithm.

The DRPC stream allows the user to pass a new testing instance to the cluster model, which then returns a "predict" field containing the predicted label for the testing instance. DRPCArgsToInstance is a BaseFunction operation that converts the arguments passed into LocalDRPC.execute() into an Instance<Integer>, which is passed to ClusterQuery; ClusterQuery then uses the k-means cluster model to determine the predicted label.
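At prediction time, k-means simply assigns an instance to its nearest centroid. That lookup can be sketched in plain Java, independently of Trident-ML (the centroids below are illustrative; Trident-ML's KMeans keeps its real centroids in the Trident state):

```java
public class NearestCentroidSketch {
    // Squared Euclidean distance between a point and a centroid.
    static double squaredDistance(double[] a, double[] b) {
        double sum = 0.0;
        for (int i = 0; i < a.length; i++) {
            double d = a[i] - b[i];
            sum += d * d;
        }
        return sum;
    }

    // Returns the index of the closest centroid, i.e. the predicted cluster.
    public static int predict(double[][] centroids, double[] point) {
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int k = 0; k < centroids.length; k++) {
            double dist = squaredDistance(centroids[k], point);
            if (dist < bestDist) {
                bestDist = dist;
                best = k;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        // Three illustrative centroids for three clusters of 3-d features.
        double[][] centroids = {{0, 0, 0}, {5, 5, 5}, {10, 10, 10}};
        // The point (4.8, 5.1, 5.0) is closest to centroid 1.
        System.out.println(predict(centroids, new double[]{4.8, 5.1, 5.0}));
    }
}
```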

Once the coding is completed, we can run the project by navigating to the project root folder and running the following command:

> mvn compile exec:java
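To run the topology on a real Storm cluster rather than the LocalCluster, the jar-with-dependencies built by the maven-assembly-plugin can be submitted with the storm client. A sketch, assuming the default 0.0.1-SNAPSHOT project version (adjust the jar name to match your pom.xml):

```shell
# Build the assembly jar (produced under target/ by the package phase)
mvn clean package

# Submit the topology; since <mainClass> is left empty in the assembly
# plugin configuration, pass the main class explicitly.
storm jar target/trident-k-means-0.0.1-SNAPSHOT-jar-with-dependencies.jar \
    com.memeanalytics.trident_k_means.App
```

Note that the App class above uses LocalCluster and LocalDRPC, so it would also need a remote-submission code path before running it on a real cluster makes sense.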


Read more »

Wednesday, August 16, 2017

Ubuntu 11.04 a.k.a. Natty Narwhal



This week I got time to upgrade my personal server, which was running Ubuntu 10.10, to the new version 11.04, a.k.a. Natty Narwhal. The process wasn't as difficult as it was many years ago, when it used to take hours and a lot of CDs.

This time I just launched the Update Manager, and the notification for the new version was already there, asking if I wanted to install it. You just need to run this command:
sudo update-manager -d

New Upgrade :)

Release Notes

Preparing upgrade

I pressed upgrade and, after it asked for the root password, it started the whole process. It removed a lot of old packages and replaced others with new versions. One thing I really liked is that I had made some modifications to some config files, and the upgrade tool asked me whether I wanted to preserve the old ones or replace them with the new ones. This was great because I was afraid I would have to redo that work.
Summary of things to do
The whole process took around 2 hours, mostly because the Ubuntu servers were so busy with all the installations that downloading the needed packages was sometimes really slow. The traffic must have been really heavy.
Installing new version
Some Clean up
In the end everything was fine, but since it is a server I couldn't run the new Unity UI; it seems I have to configure my graphics card first, which makes sense since Unity uses Compiz to run. I might need to take some more time to fix that, but either way I like the classic UI.
New Scroll UI
Other new things are the new Ubuntu Software Center and Ubuntu One, which lets you synchronize everything between computers; everyone gets 2 GB for free, and it also lets you stream your music. I need to try this one, although 2 GB is not much.
Ubuntu Control Center
Ubuntu One
Talking about software, it comes with Nautilus 2.32.2, GNOME 2.32.1, Firefox 4.0, Shotwell 0.9.2, Empathy 2.34.0, Banshee 2.0, LibreOffice 3.3.2, Evolution 2.32.2, Gwibber 3.0.0.1, Xorg Server 1.10.1, X.Org 7.6, Totem 2.32.2, Compiz 0.9.4, GDM 2.32.1, GRUB 1.99 RC1, Mesa 7.10, new artwork, Linux kernel 2.6.38.3, Compiz Fusion 0.9.4, and many more. They replaced OpenOffice with LibreOffice.
LibreOffice
You can read the complete set of new features on its official page. I like Ubuntu, and I will like it even more once I am able to configure and run Unity.

Enjoy the new version :)


Read more »