Evaluating Term Extraction Methods for Domain Analysis
This study compared the vocabularies created by several domain experts and the source documents they selected to create those vocabularies. The results indicate similarity among both the vocabularies created and the source documents selected. The relationship between the overlap scores of the vocabularies and the overlap scores of the selected source documents was also tested; no significant relationship was found between them. In addition, the variability of the overlap scores of automatically generated vocabularies was compared with that of vocabularies produced manually by domain experts; the results suggest the two sets of vocabularies differ significantly.
| Field | Value |
|---|---|
| Main Author: | Nemallapudi, Chaitanya |
| Other Authors: | Computer Science |
| Format: | Others |
| Published: | Virginia Tech, 2014 |
| Subjects: | Term extraction; Vocabulary; Overlap; Domain engineering; DARE |
| Online Access: | http://hdl.handle.net/10919/34588 http://scholar.lib.vt.edu/theses/available/etd-08162010-221748/ |
| Field | Value |
|---|---|
| id | ndltd-VTETD-oai-vtechworks.lib.vt.edu-10919-34588 |
| record_format | oai_dc |
| spelling | ndltd-VTETD-oai-vtechworks.lib.vt.edu-10919-34588 (datestamp 2020-09-26T05:37:55Z). Evaluating Term Extraction Methods for Domain Analysis. Nemallapudi, Chaitanya. Computer Science. Committee: Frakes, William B.; Chen, Ing-Ray; Kulczycki, Gregory W. Keywords: Term extraction; Vocabulary; Overlap; Domain engineering; DARE. [Abstract as in the description field.] Master of Science. Dates: 2014-03-14T20:43:36Z; 2010-08-02; 2010-08-16; 2010-09-02. Thesis. etd-08162010-221748. http://hdl.handle.net/10919/34588 http://scholar.lib.vt.edu/theses/available/etd-08162010-221748/ Files (application/pdf): VT_Fair_Use_Analysis_Results_Table_1.pdf; IRB_Certification.pdf; Chaitanya_N_T_2010.pdf; VT_Fair_Use_Analysis_Results_Figure_2.pdf. Rights: In Copyright (http://rightsstatements.org/vocab/InC/1.0/). Virginia Tech |
| collection | NDLTD |
| format | Others |
| sources | NDLTD |
| topic | Term extraction; Vocabulary; Overlap; Domain engineering; DARE |
| description | This study compared the vocabularies created by several domain experts and the source documents they selected to create those vocabularies. The results indicate similarity among both the vocabularies created and the source documents selected. The relationship between the overlap scores of the vocabularies and the overlap scores of the selected source documents was also tested; no significant relationship was found between them. In addition, the variability of the overlap scores of automatically generated vocabularies was compared with that of vocabularies produced manually by domain experts; the results suggest the two sets of vocabularies differ significantly. Degree: Master of Science |
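The record does not state how the thesis computes its overlap scores. As a minimal illustrative sketch only, one common way to score the overlap between two vocabularies is the Jaccard index over their term sets; the function name and the example vocabularies below are hypothetical, not taken from the thesis:

```python
def overlap_score(vocab_a, vocab_b):
    """Jaccard overlap between two vocabularies, treated as sets of terms.

    Returns |A ∩ B| / |A ∪ B|: 1.0 for identical vocabularies,
    0.0 for disjoint (or both empty) ones.
    """
    a, b = set(vocab_a), set(vocab_b)
    if not a and not b:
        return 0.0
    return len(a & b) / len(a | b)


# Hypothetical vocabularies from two domain experts.
expert_1 = {"term extraction", "vocabulary", "domain analysis", "reuse"}
expert_2 = {"term extraction", "domain engineering", "reuse", "DARE"}

# 2 shared terms out of 6 distinct terms overall.
print(overlap_score(expert_1, expert_2))  # → 0.3333333333333333
```

The same score can be applied to sets of selected source documents, which is what allows the comparisons the abstract describes.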
| author2 | Computer Science |
| author | Nemallapudi, Chaitanya |
| title | Evaluating Term Extraction Methods for Domain Analysis |
| publisher | Virginia Tech |
| publishDate | 2014 |
| url | http://hdl.handle.net/10919/34588 http://scholar.lib.vt.edu/theses/available/etd-08162010-221748/ |