emp's Neural Network Tutorial



NNs are not really that great outside of a few specific domains. I'd check out Hidden Markov Models and Support Vector Machines for more cutting-edge ML and applications.
 

Would you mind elaborating on this?

Apart from errors with continuous data sources, I believe a neural network can be optimized both thoroughly and rapidly.

Aren't hidden Markov models pretty infamous for being too blunt about their outputs? The assumptions and predictions can get pretty vague IMHO with an HMM. (We worked with HMMs on tracking a guest visitor's activity on a site - primarily a forum - from the time of arrival through his browsing habits to an end sale or bounce. We worked with set forums with an integrated OpenCart outlet. Our entire case study was based upon this research paper - pdf)

Other than that, HMMs require a larger number of input parameters than traditionally optimized NN options.
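As a rough picture of what the visitor-tracking HMM above looks like under the hood, here's a minimal forward-algorithm sketch in NumPy. The states, page types, and every probability below are invented placeholders, not the parameters from our case study:

```python
import numpy as np

# Hypothetical visitor states and page-view observations. All numbers
# below are placeholders for illustration, not fitted values.
states = ["browsing", "cart", "checkout"]        # hidden states
pages = ["thread", "product", "payment"]         # observable page types

pi = np.array([0.80, 0.15, 0.05])                # initial state distribution

A = np.array([[0.70, 0.25, 0.05],                # transition probabilities
              [0.30, 0.50, 0.20],                # rows: from-state
              [0.10, 0.20, 0.70]])               # cols: to-state

B = np.array([[0.80, 0.15, 0.05],                # emission probabilities
              [0.20, 0.60, 0.20],                # rows: hidden state
              [0.05, 0.15, 0.80]])               # cols: observed page type

def forward(obs):
    """Likelihood of an observed page sequence under the HMM."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

# A visitor who views a thread, then a product page, then the payment page:
seq = [pages.index(p) for p in ["thread", "product", "payment"]]
print(forward(seq))   # probability the model assigns to this session
```

Notice how many numbers you have to pin down before the model does anything (initial, transition, and emission probabilities) - that's the parameter-count complaint in a nutshell.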

If we're talking probabilistic models, then I'd highly recommend checking out Mixture Models. P.S - mattseh - PyMix - The Python Mixture package | PyMix / Home - (Like a bawse)
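If anyone wants to kick the tires on mixture models without digging through PyMix, here's a toy sketch using scikit-learn's GaussianMixture instead - a library substitution on my part, with invented data:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Fake one-dimensional data standing in for, say, session lengths (minutes)
# of bouncers vs. buyers -- purely illustrative numbers.
rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(2.0, 0.5, 500),    # short sessions
                    rng.normal(9.0, 1.5, 500)])   # long sessions
X = X.reshape(-1, 1)

# Fit a two-component Gaussian mixture with EM.
gmm = GaussianMixture(n_components=2, random_state=0).fit(X)

print(gmm.means_.ravel())            # recovered cluster centers (~2 and ~9)
print(gmm.predict([[1.5], [10.0]]))  # component assignment for new sessions
```

EM recovers the two session-length clusters without ever being told which point belongs to which - that's the appeal for unlabeled traffic data.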

Disclaimer - My knowledge of the subject matter isn't as profound as some of yours. I am also by no means a computer scientist. I just like statistics and predictive analysis a lot and work on in-house tools related to them from time to time. So excuse me if any of my posts on these topics come across as idiotic.
 

It all depends on the application and what kind of data transformation/preprocessing you do. What I meant was that NNs, while they can be used for a lot of things, aren't the most accurate, precise, or even efficient option. They are not ideal for classification either, especially multi-target classification.

They are, however, very useful in the mature domains they have been applied to before, like image recognition and captcha breaking.
 
+rep

You are way too fucking smart for WF. How is it you're not working for the feds? Oh yeah, that's right... what did you call it, a "think tank"?


Great videos, thank you for taking the time to put them together and make us all a little smarter.
 
Has anyone on WF put this info to practical use in IM or SEO, or could they? Yes, I do intend on watching the videos, but I just wanted to know if there were any practical applications for NNs in these two domains.
 

I'd suggest watching the videos first. Implementations in the IM world can be many: captcha breaking, intelligent data capturing, link profile decisions, search/ranking pattern recognition, organizing data that you've scraped, etc.

If you've heard of Nara - (Dallas Restaurants - Nara) - they have to be credited with making the concept of neural networks more mainstream. They built an automated recommendation system they called the "Nara Neural Network" and use this data to display recommendations to their users.
Nara Wants To Build A Better Recommendation Platform, Starts With Restaurants And A $4 Million Series A Round | TechCrunch
Nara Neural Networking Dining Personalization Service Goes Mobile, Adds Cities, And Targets New Categories With Partners - semanticweb.com

I recommend you watch the videos and then apply your existing experience to see what kind of data you can process through an NN architecture.

P.S - You will also need lots of first-hand data to process in order to increase the output's accuracy.
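To make that concrete, here's a toy sketch of the workflow with scikit-learn's MLPClassifier. The page features (backlinks, word count, load time) and the "ranks top 10" rule are hypothetical stand-ins, just to show the shape of the pipeline:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Hypothetical scraped-page features: [backlinks, word_count, load_ms].
# Both the data and the "ranks top 10" labeling rule are made up.
rng = np.random.default_rng(42)
X = rng.uniform([0, 200, 100], [500, 3000, 5000], size=(1000, 3))
y = ((X[:, 0] > 250) & (X[:, 2] < 2500)).astype(int)  # fake ranking rule

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Small feed-forward network: one hidden layer of 16 units.
net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X_train, y_train)

print(net.score(X_test, y_test))  # accuracy on held-out pages
```

Swap in real scraped features and real ranking labels and the shape stays the same - which is where the first-hand data volume comes in, since a few dozen labeled pages won't train anything usable.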
 
LOL @ the funnel on those videos

Part 1 --> 79
Part 2 --> 15
Part 3 --> 7
Part 4 --> 6

So to those 6, I salute you.

::emp::

Thanks for the thread :emp: - you can now make that 7 who have made it through Part 4.

LOL


I was interested in ANNs back in the early 1990s while the field was still new, and I ran into the problem you mentioned in the first video.


Example: Picture recognition.


Dog and Fish Pics.

Input: Fish pic then define Fish for the output.

Input: Fish pic ANN output = ?

2 more Fish Pic inputs with confirming definitions that it is indeed a fish pic and then...

Input: Fish pic ANN output = Fish


Now we add the Dog Pic.

Input: Dog pic ANN output = ?

Input: Dog pic then define Dog for the output.

2 more Dog Pic inputs with confirming definitions that it is indeed a dog pic and then...

Input: Dog pic ANN output = Dog


Here is where it would seem to break:


Input: Fish pic and then define it as a Dog pic one time.

Input: Fish pic ANN = Fish

Input: Fish pic and define it as a dog two more times.

Input: Fish pic ANN output = Fish or Dog (Now confused, having equal amounts of conflicting input data.)

One more input of Fish pic defined as a Dog and the ANN will now output Dog for a fish pic.


The same held true for the Dog pic and telling the ANN it was now a fish.
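That flipping behavior is easy to reproduce with a toy model. Here's a single-neuron sketch in NumPy - the two-feature "pic" encoding is an assumption purely for illustration - that gets relabeled exactly as in the walkthrough above:

```python
import numpy as np

# Toy encoding (an assumption for illustration): a "pic" is a 2-feature
# vector [fishiness, dogginess]; label 1 = Fish, 0 = Dog.
fish_pic = np.array([1.0, 0.0])
dog_pic = np.array([0.0, 1.0])

w, b, lr = np.zeros(2), 0.0, 1.0   # single logistic neuron, online updates

def train(x, label):
    global w, b
    p = 1 / (1 + np.exp(-(w @ x + b)))   # sigmoid output in (0, 1)
    w += lr * (label - p) * x            # one online gradient step
    b += lr * (label - p)

def output(x):
    p = 1 / (1 + np.exp(-(w @ x + b)))
    return "Fish" if p > 0.5 else "Dog"

for _ in range(3):            # 3 Fish pics confirmed as Fish
    train(fish_pic, 1)
print(output(fish_pic))       # -> Fish

for _ in range(4):            # now the same pic gets relabeled Dog
    train(fish_pic, 0)
print(output(fish_pic))       # -> Dog: the conflicting labels have won
```

Around the point where the conflicting labels balance out, the output hovers near 0.5 - the "confused" state above - and a couple more relabels tip it over for good. A lower learning rate only delays the flip; it doesn't prevent it.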


I know this is LOGICAL, but in order to get this to work with AI as would be needed in a complex system, such as say a robot/android, it needed to be better.


I had not checked into the developments of ANNs since the early 1990s and was glad you took the time to create this thread. Especially that you showed how backpropagation and weighted data inputs can be used in the hidden layer to help quantify input data in order to reduce the noise influence and help the network maintain its integrity.
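For anyone who wants to see that hidden-layer backpropagation in miniature, here's a generic sketch (not the exact formulation from the videos) of a tiny network learning XOR:

```python
import numpy as np

# Tiny 2-input, 4-hidden, 1-output network trained with backpropagation.
rng = np.random.default_rng(1)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)    # input -> hidden weights
W2, b2 = rng.normal(size=4), 0.0                 # hidden -> output weights

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([0, 1, 1, 0], float)                # XOR targets

def sig(z):
    return 1 / (1 + np.exp(-z))

lr = 0.5
for _ in range(20000):
    for x, t in zip(X, y):
        h = sig(x @ W1 + b1)              # forward pass: hidden activations
        o = sig(h @ W2 + b2)              # forward pass: output
        d_o = (o - t) * o * (1 - o)       # error signal at the output
        d_h = d_o * W2 * h * (1 - h)      # error backpropagated to hidden layer
        W2 -= lr * d_o * h                # update hidden -> output weights
        b2 -= lr * d_o
        W1 -= lr * np.outer(x, d_h)       # update input -> hidden weights
        b1 -= lr * d_h

print(sig(sig(X @ W1 + b1) @ W2 + b2).round(2))  # should approach [0 1 1 0]
```

The d_h line is the backpropagation step: the output error is pushed back through the hidden weights so each hidden unit is adjusted in proportion to how much it contributed to the mistake.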


Thank you for getting me interested in this again. I know back in the early '90s they were using this in horse racing and getting very promising results.

Back then they were also talking about ANN-specific hardware. Below are a couple of blast-from-the-past links:

http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.7.8418&rep=rep1&type=pdf

http://yann.lecun.com/exdb/publis/pdf/boser-92a.pdf



What they've been doing since:

Artificial neural networks in hardware: A survey of two decades of progress



I'll be going to the links that you and others have put up in this thread.



Thanks again :emp: and to everyone else who posted in here.
 
 
I have no idea what you're talking about, but I like your voice. No homo


Awww....My garage door is open and my little Volkswagen now has four flat tires.


Lulz


P.S. Yes, I've now had 18 shots of Ta Kill Ya and four glasses of Chardonnay in the last 10 hours on an empty stomach. (Oops, I lied... just had like 26 jumbo shrimp about an hour ago!! Lulz)

Hi Howdy Jeffrey!!

Lulz