
Figure 3.3: Down-selection models, Bhattacharjee predicting Stearman. Clustering results are summarized for kNN, RF, and LDA classifier models, where training used the Bhattacharjee gene lists and testing was then done on the Stearman data. Each graph shows the training set number on the y-axis and the cumulative AUC value on the x-axis. The last column summarizes the performance at each selection level across the three algorithms. Each row of graphs shows outcomes for a cleansing method (RMA, dCHIP, and BaFL) followed by t-test classification for DE results. The four sets of data resulting from additional selection criteria are denoted by the same colors and are the same as described in the legend to Figure 3.2.

Author’s List (Validation)

The 325 DE ProbeSets were compared against the BaFL-passed genes in the author’s published lists. The validation models incorporated minor perturbations, through random sampling of both the training and testing sets, over 100 iterations to achieve a reasonable measurement of the model’s classification performance [6, 12, 36, 37]; a minimal sketch of this resampling scheme is given below.
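As a point of reference, the following Python/scikit-learn sketch illustrates the repeated random-subsampling evaluation described above. The function name repeated_subsampling_auc, the variables X and y, and all parameter settings (neighbour count, forest size, test fraction) are illustrative assumptions rather than the settings actually used in this work.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

def repeated_subsampling_auc(X, y, n_iter=100, test_size=0.3, seed=0):
    """Mean AUC per classifier over n_iter random train/test splits.

    X: expression matrix (samples x ProbeSets); y: binary class labels.
    """
    models = {
        "kNN": KNeighborsClassifier(n_neighbors=5),
        "RF": RandomForestClassifier(n_estimators=500, random_state=seed),
        "LDA": LinearDiscriminantAnalysis(),
    }
    rng = np.random.RandomState(seed)
    scores = {name: [] for name in models}
    for _ in range(n_iter):
        # Perturb both the training and testing sets by drawing a fresh stratified split.
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=test_size, stratify=y,
            random_state=rng.randint(1 << 30))
        for name, model in models.items():
            model.fit(X_tr, y_tr)
            # AUC is computed from the predicted probability of the positive class.
            prob = model.predict_proba(X_te)[:, 1]
            scores[name].append(roc_auc_score(y_te, prob))
    return {name: float(np.mean(vals)) for name, vals in scores.items()}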

Figure 3.4 presents the results of training the models using the Stearman dataset and testing on the Bhattacharjee data, and the three data

