Nomine

Names are boring. I mean, how many “Johns” do we really need in the world? I’d put a hard cap at 11.

Sorry, ma’am, John is taken. How about j0hnnyb0i-872?

Come on, no one wants j0hnnyb0i-872. If only there were a way to find a unique name without carving out an afternoon and locking yourself in your basement with your pink spiral notebook (the one with the flowers).

Well, today is your lucky day, because you have found Nomine, the neural-net-based random name generator! My partners and I developed this beautiful program as a final project for Linearity 2 to illustrate the useful properties of training a neural network by backpropagation and gradient descent.

Before I show you how we did it, let’s check out the results. A useful note about neural networks for the unenlightened: their behavior depends on the data they are trained on. So we were able to vary the output of our program by feeding it databases of different types of names, such as male, female, or dog names. Anyway, here are some databases we used and their results. My favorite is Quigsby.

Best of 1990s Top 500 Male Names:
  1. Frando
  2. Aclen
  3. Cisl
  4. Ichold
  5. Perlan
  6. Lero
  7. Allarman
  8. Bertin
  9. Dwintis
  10. Elbert
  11. Oshie
  12. Vencester

Best of 1990s Top 500 Female Names:
  1. Athlei
  2. Cyna
  3. Neotha
  4. Vella
  5. Arlen
  6. Therisene
  7. Janafrine
  8. Quelia
  9. Onna
  10. Berla
  11. Tinisa
  12. Ubira

Best 1880s Male Names:
  1. Harlius
  2. Linton
  3. Percer
  4. Artin
  5. Delbur
  6. Tharl
  7. Elipo
  8. Bertis
  9. Orche
  10. Nell
  11. Uster
  12. Vere

Best Dog Names:
  1. Quigsby
  2. Baisie
  3. Jazer
  4. Pergie
  5. Coley
  6. Porley
  7. Epho
  8. Ishe
  9. Tooka
  10. Blie
  11. Illy
  12. Layla

This is a small fraction of the total names generated. To see more, check out the end of our paper, linked at the bottom.
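If you’re wondering what “feeding it a database of names” actually looks like, here’s a minimal sketch of the general idea. This is not our actual network; it’s just a character-level bigram model, which captures the same core trick of learning which letters tend to follow which from a list of names and then sampling new ones. The tiny name list below is a made-up stand-in for the real databases.

```python
import random
from collections import defaultdict

def train_bigrams(names):
    """Count, for each character, which characters tend to follow it.
    '^' marks the start of a name and '$' marks the end."""
    counts = defaultdict(lambda: defaultdict(int))
    for name in names:
        chars = ["^"] + list(name.lower()) + ["$"]
        for a, b in zip(chars, chars[1:]):
            counts[a][b] += 1
    return counts

def sample_name(counts, rng, max_len=10):
    """Walk the bigram table from '^' until we hit the end marker."""
    out, ch = [], "^"
    while len(out) < max_len:
        nxt = rng.choices(list(counts[ch]), weights=list(counts[ch].values()))[0]
        if nxt == "$":
            break
        out.append(nxt)
        ch = nxt
    return "".join(out).capitalize()

# Hypothetical stand-in "database"; real runs used full name lists.
names = ["albert", "allan", "arlen", "bert", "delbert", "elbert", "linton"]
table = train_bigrams(names)
rng = random.Random(42)
print([sample_name(table, rng) for _ in range(5)])
```

A neural network does the same job with much longer memory than one letter, which is why its output sounds more name-like than a bigram model’s.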

Extension: So far we have only tested this on names, but the program copies the letter structure of whatever dictionary of words it is given, so it should work on verbs and adjectives as well.


Finally, how it’s done. For a quick overview of backpropagation and gradient descent, check out this funny and informative video my partners made.
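If you’d rather read code than watch a video, here’s a hedged, minimal sketch of gradient descent in action. It fits a single sigmoid neuron, where backpropagation reduces to one application of the chain rule; this is a toy example I’m including for illustration, not our actual training code, and the data is made up.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(data, steps=2000, lr=0.5):
    """Fit a single neuron y = sigmoid(w*x + b) by gradient descent.
    The gradient of the squared error comes from the chain rule --
    the one-neuron special case of backpropagation."""
    w, b = 0.0, 0.0
    for _ in range(steps):
        dw = db = 0.0
        for x, target in data:
            y = sigmoid(w * x + b)
            # chain rule: dE/dw = (y - target) * y * (1 - y) * x
            err = (y - target) * y * (1.0 - y)
            dw += err * x
            db += err
        w -= lr * dw  # step downhill along the gradient
        b -= lr * db
    return w, b

# Toy data: label x >= 0 as 1 and x < 0 as 0.
data = [(-2, 0), (-1, 0), (1, 1), (2, 1)]
w, b = train(data)
print(round(sigmoid(w * 2 + b)), round(sigmoid(w * -2 + b)))
```

In the full network the same chain-rule bookkeeping is repeated layer by layer, which is all “backpropagation” really means.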

For more in-depth information on the topic check out our paper. If you want to run it yourself, check out our source code.

As always, if you have any questions or want to suggest a database to train on, feel free to shoot me an email at the contact info listed below.
