Collecting Robots

Collecting Robots is a little project I made for a university course on Operating Systems for Robotics. The goal is to plan a strategy for some “robots” to collect some “objects” and take them to some “collect points”. The robots also have a maximum capacity, i.e. they can carry only a limited number of objects at once, so to keep going they need to drop their objects off at a collect point.

The problem is quite hard (I believe it is NP-hard), so my first thought was to use a heuristic approach. I implemented a genetic solution using the JSGenetic library I wrote some time ago.

There is actually a little problem with my implementation: the crossover operation is quite weak. It does not combine the two parent individuals thoroughly, so the generated children are often very similar to the parents. This shifts most of the evolutionary work onto the mutation process, which is much more random, and this means the evolution is more likely to end up in a local optimum. This is especially evident with large problems (many objects) whose object positions are not uniformly distributed on the map.
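
To give an idea of what I mean, here is a minimal sketch (not the actual JSGenetic code, the names are made up for illustration) of the kind of encoding and operators involved: a candidate solution is an ordered list of object indices, a naive single-point crossover has to discard duplicates and ends up copying most of one parent, so a swap mutation ends up doing most of the exploration.

```javascript
// Hypothetical encoding: a candidate solution is an ordered list of
// object indices visited by the robots.
function crossover(parentA, parentB) {
  // naive single-point crossover with duplicate repair: most of the child
  // is copied verbatim from parent A, which is the weakness described above
  const cut = Math.floor(Math.random() * parentA.length);
  const head = parentA.slice(0, cut);
  const tail = parentB.filter(gene => !head.includes(gene));
  return head.concat(tail);
}

function mutate(individual) {
  // swap two random genes; in practice most of the exploration happens here
  const i = Math.floor(Math.random() * individual.length);
  const j = Math.floor(Math.random() * individual.length);
  [individual[i], individual[j]] = [individual[j], individual[i]];
  return individual;
}
```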

The app is publicly available here.

Browser compatibility:

  • [Google Chrome/Chromium] OK
  • [Firefox] Some problems: the “go” and “animate” buttons work but the labels don’t refresh (isn’t .innerText standard?), and the simulation is slower
  • [IE] Not tested
  • [Safari] Not tested

Some notes on usage of the app:

Map generation

You may use the randomly generated map which is presented on startup, generate a new one with the “randomize” button, or generate your own map.

To build your own map, press “clear” to clear the current map, select the type of point you want to place from the dropdown list, and place it by clicking on the map.

If you want to change the number of points used by the randomize option, place the points manually on the map, in whatever number and type you want, and then press “randomize”: it will generate a new random map with the same number and types of points you placed.

Evolution of the solution

When the map is ready, you can press “init Evolver” to initialize the Evolver object, which is responsible for managing the evolution process.

Then, pressing “go” will start the evolution, showing the best solution found in the left canvas.

When you’re happy with the shown solution, press “stop”.

Simulation

Beware that you need to stop the evolution before running the simulation. Doing otherwise will lead to unknown results 😛

When you have a solution (you need to have at least initialized the Evolver), you can see the robots in action in the right canvas by pressing “animate”, which will start the simulation with the current solution.

JSGeneticNeuralNetwork.js experiment

This new project is about feed-forward neural networks with genetically evolved weights. I’ve combined my two libraries, JSGenetic and JSNeuralNetwork, plus a little module I developed for continuous genetic algorithms, into a small library that evolves neural networks with a genetic algorithm.
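
The core idea, roughly sketched below (the actual library API may differ, and the function names here are hypothetical), is that the network’s weights are flattened into an array of floats and evolved with operators suited to continuous genomes: an arithmetic blend for crossover and small random noise for mutation, with fitness measured by how well the resulting network performs.

```javascript
// Rough sketch of evolving weights with a continuous genetic algorithm.
function blendCrossover(a, b) {
  // arithmetic blend: the child lies somewhere between the two parents
  const t = Math.random();
  return a.map((w, i) => t * w + (1 - t) * b[i]);
}

function mutateWeights(weights, amount = 0.1, rate = 0.05) {
  // perturb a small fraction of the weights with small random noise
  return weights.map(w =>
    Math.random() < rate ? w + amount * (Math.random() * 2 - 1) : w
  );
}

// fitness: how well a network using these weights performs on the task,
// e.g. the negative mean squared error over a set of training samples
function fitness(weights, evaluateNetworkError) {
  return -evaluateNetworkError(weights);
}
```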

A little example I made to test the library can be found here: it’s pacman learning to eat.

There are also other examples in the JSNeuralNetwork page.

Naive Bayes Classifier in JS, empowering a Telegram webapp

Today I made a simple implementation of a naive Bayes classifier in JavaScript. The implementation was largely inspired by this article.
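
For the curious, here is a minimal sketch of the idea (not the exact code of my implementation): count word occurrences per category during training, then score a message for each category with log-probabilities and Laplace smoothing.

```javascript
// Minimal naive Bayes classifier over word counts.
function createClassifier() {
  const wordCounts = {};   // category -> { word -> count }
  const docCounts = {};    // category -> number of trained messages
  const vocabulary = new Set();

  function tokenize(text) {
    return text.toLowerCase().split(/\W+/).filter(Boolean);
  }

  return {
    train(text, category) {
      docCounts[category] = (docCounts[category] || 0) + 1;
      wordCounts[category] = wordCounts[category] || {};
      for (const word of tokenize(text)) {
        wordCounts[category][word] = (wordCounts[category][word] || 0) + 1;
        vocabulary.add(word);
      }
    },
    classify(text) {
      const totalDocs = Object.values(docCounts).reduce((a, b) => a + b, 0);
      const scores = {};
      for (const category of Object.keys(docCounts)) {
        const counts = wordCounts[category];
        const totalWords = Object.values(counts).reduce((a, b) => a + b, 0);
        let score = Math.log(docCounts[category] / totalDocs); // prior
        for (const word of tokenize(text)) {
          const count = counts[word] || 0;
          // Laplace smoothing so unseen words don't zero out the probability
          score += Math.log((count + 1) / (totalWords + vocabulary.size));
        }
        scores[category] = score;
      }
      return scores; // higher (less negative) score = more likely category
    },
  };
}
```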

After that, I spent some time integrating it with this Telegram webapp, and now I have a Telegram webapp in which you can mark messages as ‘spam’ or ‘important’; every message is then classified with a confidence score for each of these categories.

It saves the results to Chrome’s local storage, so you can train the classifier over time, preserving the results between sessions.
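
Something along these lines (the storage key below is made up for illustration, and the real app may use the chrome.storage API rather than window.localStorage):

```javascript
// Persistence sketch: serialize the classifier state as JSON.
function saveClassifier(state) {
  localStorage.setItem('bayes-classifier', JSON.stringify(state));
}

function loadClassifier() {
  const raw = localStorage.getItem('bayes-classifier');
  return raw ? JSON.parse(raw) : null;
}
```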

Once I’ve trained it enough, and if the results are good, I plan to implement auto-hiding of messages based on this data. This is intended for large Telegram group chats, where off-topic messages often make it painful to dig the important ones out of the enormous amount of things people have texted. How many times do you open your favorite messaging app and find ~50 unread messages? This is meant to solve that. Even if a naive Bayes classifier isn’t perfect for the task, it was easy to implement and usually gives good results. I’ll post an update here if the test gives encouraging results.

Meanwhile, here’s a screenshot:

[Screenshot: the Telegram naive Bayes webapp]

JSNeuralNetwork updates and some more experiments

I’ve updated the JSNeuralNetwork library with a small and still simple implementation of the Hopfield network, along with some examples.
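
For reference, here is a minimal sketch of how such a Hopfield network works (illustrative only, not the library’s actual API): ±1 patterns are stored with the Hebbian rule, and recall repeatedly applies sign(W·x) until the state settles.

```javascript
// Train the weight matrix with the Hebbian rule over ±1 patterns.
function trainHopfield(patterns) {
  const n = patterns[0].length;
  const W = Array.from({ length: n }, () => new Array(n).fill(0));
  for (const p of patterns) {
    for (let i = 0; i < n; i++) {
      for (let j = 0; j < n; j++) {
        if (i !== j) W[i][j] += (p[i] * p[j]) / patterns.length;
      }
    }
  }
  return W;
}

// Recall: synchronously update the whole state a few times.
function recall(W, input, steps = 10) {
  let state = input.slice();
  for (let s = 0; s < steps; s++) {
    state = state.map((_, i) => {
      const sum = W[i].reduce((acc, w, j) => acc + w * state[j], 0);
      return sum >= 0 ? 1 : -1;
    });
  }
  return state;
}
```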

Moreover, I implemented a new experiment based on Kohonen networks; it’s about vision, and it needs a webcam to work.


As usual everything is here: http://www.nicassio.it/daniele/JSNeuralNetwork/.

JSNeuralNetwork first implementation

Since I started attending an introductory neural networks class at university, I’ve been looking to create something that would help me understand the subject better through practice. So I decided to implement a sample neural network library in JavaScript, which will cover at least some of the networks we see in class.

The first one I’ve implemented is a WTA (Winner Takes All) network, and an example can be found here. There is room for improvement and I may refine it in the future, but my goal in this blog is to point out the intuition behind what I do, not to provide production-ready tools.
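
The core of a WTA network is a competitive-learning step like the one sketched below (illustrative only, not the JSNeuralNetwork API): the unit whose weight vector is closest to the input wins, and only the winner’s weights move toward the input.

```javascript
// One competitive-learning step: find the winning unit and update it.
function winnerTakesAllStep(units, input, learningRate = 0.1) {
  // each unit is a weight vector; the winner is the one closest to the input
  let winner = 0;
  let best = Infinity;
  units.forEach((w, i) => {
    const dist = w.reduce((acc, wi, j) => acc + (wi - input[j]) ** 2, 0);
    if (dist < best) { best = dist; winner = i; }
  });
  // only the winning unit moves toward the input
  units[winner] = units[winner].map((wi, j) => wi + learningRate * (input[j] - wi));
  return winner;
}
```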

JSGenetic library and new Genetic Environment implementation

I’ve created a little Genetic Algorithm library called JSGenetic, which takes care of the population management of a genetic algorithm.

As a test for the library, I’ve re-implemented an old project, the Genetic Environment. It’s basically a simulation of an environment made up of resources (water and food), which spontaneously grow or rain onto the field, and of “animals”, which have a genetic code specifying their behavior given what they see of the environment: they can only “see” north, south, east and west. The animals also have a little 4-bit memory, and can tell when they are running out of water or food.

The genetic code of the animals is initially random, and is progressively processed by the genetic library, which selects the individuals that survive the longest.

After some generations, it’s clear that the individuals live longer because of the selection made on their genetic code, which defines how they behave.
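
To give an idea of how such a genome can be used (the actual encoding in the project may differ, and the numbers below are assumptions for illustration), the animal’s perception of the four directions plus its 4-bit memory can be packed into an index into the genetic code, treated as a big lookup table of actions:

```javascript
// Sketch: the genome as a lookup table from (perception, memory) to an action.
function decideAction(genome, perception, memory) {
  // perception: [north, south, east, west], each a small integer code
  // (assumed here to take one of 4 possible values per direction)
  // memory: 4 bits packed into a number in 0..15
  const perceptionIndex = perception.reduce((acc, v) => acc * 4 + v, 0);
  const index = perceptionIndex * 16 + memory;
  return genome[index]; // an action such as "move north" or "eat"
}
```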

The code of the JSGenetic library is on GitHub, and the live example of the code is on my github page at this address.

Hierarchical clustering of blog posts fetched through RSS Feed

Today I tried to implement a simple webapp that retrieves the RSS feed at a given URL, extracts the content of each post, and uses a hierarchical clustering algorithm to group the posts into rough categories by content similarity.

Actually the implementation is really poor, and I’m not even sure it works. Anyway, I’ve wanted to try some kind of text classifier for a long time, and here we are. The app extracts the text from the RSS feed, then indexes the words in it and uses them as features for the algorithm. Short posts generally mean bad results, especially without any kind of generalization of the features (tokenization should be the word in this case). In fact, the results are hardly understandable and I guess they’re basically random.
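
Roughly, the feature extraction looks like the sketch below (simplified; the real code also deals with jFeed and the DOM): each post becomes a word-count vector over a shared vocabulary, and the vectors are handed to clusterfck’s hierarchical clustering.

```javascript
// Turn each post's text into a fixed-length word-count vector.
function buildFeatureVectors(posts) {
  const vocabulary = [];
  const index = {};
  const counts = posts.map(text => {
    const c = {};
    for (const word of text.toLowerCase().split(/\W+/).filter(Boolean)) {
      c[word] = (c[word] || 0) + 1;
      if (!(word in index)) { index[word] = vocabulary.length; vocabulary.push(word); }
    }
    return c;
  });
  return counts.map(c => vocabulary.map(word => c[word] || 0));
}

// The vectors then go to the library, roughly like this
// (see the clusterfck README for the exact options):
// const tree = clusterfck.hcluster(buildFeatureVectors(posts));
```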

Anyway, here’s a list of what I (sort of) learned along the way:

  • what is hierarchical clustering (not how it works, though)
  • d3.js graph library basics (very basic basics)
  • how to use NetBeans to develop webapps

That’s not so bad for a spare afternoon and evening.

As I said, I didn’t implement the algorithm myself; I used a library from GitHub, clusterfck.
Oh, and I also used the jFeed jQuery plugin for parsing the RSS, but I slightly modified it to fetch the content of the entries and not to crash while trying to detect IE.

Here’s the link.

Genetic Shaping Layout

It’s been a long time since I decided to try using genetic algorithms to optimize a web page’s style, and here’s my first attempt. The idea is simple: generate some random styles and let the user choose which one they prefer, then use the genetic operators to combine the chosen CSS with the others. Admittedly, with a ‘population’ of only six elements it’s not much of a genetic algorithm, but the principle is the same.

The problem with this kind of implementation is that you need a human to select the best generated styles, which prevents large populations and selections. However, one possibility would be to implement this server-side, taking advantage of the selections made by multiple users.
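
For illustration, here is a sketch of the kind of genome and crossover involved (the CSS properties below are made up for the example, not necessarily the ones the app actually evolves):

```javascript
// Each style is a small genome of CSS parameters.
function randomStyle() {
  return {
    fontSize: 12 + Math.floor(Math.random() * 12), // px
    hue: Math.floor(Math.random() * 360),          // background hue
    padding: Math.floor(Math.random() * 30),       // px
  };
}

// Uniform crossover: each property comes from one of the two parents.
function crossoverStyles(chosen, other) {
  const child = {};
  for (const key of Object.keys(chosen)) {
    child[key] = Math.random() < 0.5 ? chosen[key] : other[key];
  }
  return child;
}

// Apply a genome to a DOM element so the user can judge it.
function applyStyle(element, style) {
  element.style.fontSize = style.fontSize + 'px';
  element.style.backgroundColor = `hsl(${style.hue}, 60%, 85%)`;
  element.style.padding = style.padding + 'px';
}
```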

Here’s the link.

Implementation of a K Means Clustering to classify documents by language

Recently I’ve been interested in machine learning, and I’ve made some sample implementations to understand the subject better. In particular, I’ve recently implemented a simplified version of the K-Means clustering algorithm, and then decided to apply it to a more practical task.

I’ve used K-Means clustering to classify some texts by the language they’re written in. In brief, you provide several different texts to the algorithm, decide how many groups you want them classified into, and run the algorithm. It will partition the texts according to the relative frequencies of the letters in them, trying to recognize structure specific to the different languages. It’s not perfect, but it works, and there’s a live version to try here.
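
For reference, here is a minimal sketch of the approach (simplified compared to the live demo): each text becomes a 26-dimensional vector of relative letter frequencies, and a plain k-means loop groups similar vectors together.

```javascript
// Feature extraction: relative frequency of each letter a..z.
function letterFrequencies(text) {
  const counts = new Array(26).fill(0);
  let total = 0;
  for (const ch of text.toLowerCase()) {
    const code = ch.charCodeAt(0) - 97;
    if (code >= 0 && code < 26) { counts[code]++; total++; }
  }
  return counts.map(c => (total ? c / total : 0));
}

// Plain k-means: assign to nearest centroid, then recompute centroids.
function kMeans(vectors, k, iterations = 20) {
  let centroids = vectors.slice(0, k).map(v => v.slice()); // naive initialization
  let assignment = new Array(vectors.length).fill(0);
  for (let it = 0; it < iterations; it++) {
    assignment = vectors.map(v => {
      let best = 0, bestDist = Infinity;
      centroids.forEach((c, i) => {
        const d = c.reduce((acc, ci, j) => acc + (ci - v[j]) ** 2, 0);
        if (d < bestDist) { bestDist = d; best = i; }
      });
      return best;
    });
    centroids = centroids.map((c, i) => {
      const members = vectors.filter((_, idx) => assignment[idx] === i);
      if (!members.length) return c;
      return c.map((_, j) => members.reduce((acc, m) => acc + m[j], 0) / members.length);
    });
  }
  return assignment; // cluster index for each input text
}
```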

The longer the text, the more accurate the algorithm will be. In fact, with short texts the results will be quite random.