The common assumptions made when conducting a t-test include those regarding the scale of measurement, random sampling, normality of the data distribution, adequacy of sample size, and equality of variance.
The T-Test
The t-test was developed by William Sealy Gosset, a chemist working for the Guinness brewing company, as a simple way to monitor the consistent quality of stout. It was further developed and adapted, and the term now refers to any test of a statistical hypothesis in which the test statistic is expected to follow a t-distribution if the null hypothesis is supported.
A t-test is an analysis of two population means through the use of statistical examination; a two-sample t-test is commonly used with small sample sizes, testing the difference between the samples when the variances of the two normal distributions are not known.
The t-distribution is any continuous probability distribution that arises from estimating the mean of a normally distributed population using a small sample size and an unknown population standard deviation. The null hypothesis is the default assumption that no relationship exists between the two measured phenomena.
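To make this concrete, here is a minimal sketch of how a two-sample t-statistic is computed by hand using the pooled-variance form, with two small hypothetical samples (the values are illustrative, not from the original article), using only Python's standard library:

```python
# Hypothetical sketch: pooled-variance two-sample t-statistic by hand.
from statistics import mean, variance

sample_a = [98, 102, 100, 97, 103, 101]
sample_b = [105, 108, 104, 107, 106, 103]

n_a, n_b = len(sample_a), len(sample_b)

# Pooled variance combines the two sample variances,
# weighted by their degrees of freedom.
pooled_var = ((n_a - 1) * variance(sample_a) +
              (n_b - 1) * variance(sample_b)) / (n_a + n_b - 2)

# The t-statistic: difference in sample means scaled by the
# pooled standard error of that difference.
t_stat = (mean(sample_a) - mean(sample_b)) / \
         (pooled_var * (1 / n_a + 1 / n_b)) ** 0.5
print(round(t_stat, 3))
```

Under the null hypothesis of equal population means, this statistic would follow a t-distribution with n_a + n_b − 2 degrees of freedom.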
T-Test Assumptions
The first assumption made regarding t-tests concerns the scale of measurement. The assumption for a t-test is that the scale of measurement applied to the data collected follows a continuous or ordinal scale, such as the scores on an IQ test.
The second assumption made is that of a simple random sample: that the data are collected from a representative, randomly selected portion of the total population.
The third assumption is that the data, when plotted, result in a normal, bell-shaped distribution curve.
The fourth assumption is that a reasonably large sample size is used. A larger sample size means the distribution of results should approach a normal bell-shaped curve.
The final assumption is homogeneity of variance. Homogeneous, or equal, variance exists when the standard deviations of the samples are approximately equal.
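In practice, these assumptions determine which variant of the test to run. A minimal sketch, assuming SciPy is available and using randomly generated data in place of real samples, shows the standard equal-variance test alongside Welch's version, which drops the homogeneity-of-variance assumption:

```python
# Sketch of a two-sample t-test; data are simulated, not real observations.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Two hypothetical samples from normal populations with equal spread.
sample_a = rng.normal(loc=100, scale=15, size=30)
sample_b = rng.normal(loc=105, scale=15, size=30)

# Standard two-sample t-test (assumes equal variances).
t_stat, p_value = stats.ttest_ind(sample_a, sample_b)

# Welch's t-test: use when the equal-variance assumption is doubtful.
t_welch, p_welch = stats.ttest_ind(sample_a, sample_b, equal_var=False)

print(f"pooled:  t = {t_stat:.3f}, p = {p_value:.3f}")
print(f"welch:   t = {t_welch:.3f}, p = {p_welch:.3f}")
```

A small p-value would suggest rejecting the null hypothesis that the two population means are equal; with roughly equal sample sizes and variances, the two variants give very similar results.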


