
Probability Theory

Following "A concise Course in Statistical Inference", I will start summarizing the main ideas.

Data analysis, pattern recognition, machine learning, and data mining are general terms we see everywhere; they all refer to special cases of statistical inference. Typical problems include classification, prediction, clustering, and estimation.

- In probability, we ask: given a data-generating process, what are the properties of the outcomes?
- In statistical inference, we ask: given the outcomes, what can we say about the process that generated the data? (A toy simulation illustrating both directions follows this list.)
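Here is a minimal sketch of the two directions, assuming a coin-flip process with a made-up bias p = 0.6 (an example of my own, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# Probability direction: the process is fully known (a coin with p = 0.6);
# we study the properties of the outcomes it generates.
p_true = 0.6
outcomes = rng.binomial(n=1, p=p_true, size=1000)

# Inference direction: only the outcomes are observed; we try to
# recover something about the process, here by estimating p with
# the sample mean.
p_hat = outcomes.mean()
print(f"true p = {p_true}, estimated p = {p_hat:.3f}")
```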

This first entry is devoted to probability.

A probability distribution is a function that describes the probability of a random variable taking a certain value (or falling in a certain range). There are two kinds:

- Discrete probability distribution (a probability on a finite or countable sample space): characterized by a probability mass function (PMF). A special case is the uniform distribution: the sample space is finite and each outcome is equally likely, so each of the n outcomes has probability 1/n.

- Continuous probability distribution, i.e., the distribution of a continuous random variable X: characterized by a probability density function (PDF). Examples: the normal, uniform, and chi-squared distributions. (A short sketch contrasting PMFs and PDFs follows this list.)
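To make the PMF/PDF distinction concrete, here is a small sketch using scipy.stats (my choice of library; the post itself does not name one):

```python
from scipy import stats

# Discrete case: a fair six-sided die has a uniform PMF over {1, ..., 6}.
die = stats.randint(low=1, high=7)   # discrete uniform on 1..6
print(die.pmf(3))                    # P(X = 3) = 1/6, about 0.1667

# Continuous case: the standard normal has a PDF, not a PMF.
z = stats.norm(loc=0, scale=1)
print(z.pdf(0.0))                    # density at 0, about 0.3989 (not a probability)
print(z.cdf(1.0) - z.cdf(-1.0))      # P(-1 <= Z <= 1), about 0.6827
```

Note that for a continuous X, P(X = x) = 0 for any single value x; probabilities come from integrating the PDF over a range, which is also why a density value can exceed 1 without contradiction.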
