A Markov Chain is a simple state machine. The machine is always in one of several states, and from each state there is a fixed probability (possibly zero) of transitioning to each other state. Consider a baseball game: one state might be two outs, runner on second. After the next at-bat, the state might change to three outs, or to two outs, runners on first and second. The probability of the state changing to one out, nobody on base is zero, since outs never decrease within an inning.
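A chain like this can be sketched as a table of transition probabilities keyed by state. The state names and numbers below are illustrative, not real baseball statistics:

```python
import random

# Transition probabilities out of one state (illustrative values only).
# Each state's outgoing probabilities sum to 1.0.
transitions = {
    "2 outs, runner on 2nd": {
        "3 outs": 0.63,
        "2 outs, runners on 1st and 2nd": 0.20,
        "2 outs, runner on 3rd": 0.12,
        "2 outs, nobody on, run scored": 0.05,
        "1 out, nobody on": 0.0,  # impossible: outs never decrease
    },
}

def step(state, rng=random):
    """Sample the next state according to the transition probabilities."""
    options = transitions[state]
    states, weights = zip(*options.items())
    return rng.choices(states, weights=weights, k=1)[0]

print(step("2 outs, runner on 2nd"))
```

Running the chain is just repeated sampling: each call to `step` picks the next state using only the current state, which is the defining Markov property.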
A Hidden Markov Model is one where the states themselves cannot be observed directly; you see only signals the states emit, and must infer both the hidden state sequence and the model's probabilities from data. The simplest related inference task is estimating unknown transition probabilities from observed state sequences. You might, for example, watch hundreds of baseball games and find that, out of 100 times there were two outs with a runner on second, the batter made an out 63 times. From that, you estimate the unknown transition probability at roughly 63%.