What is in a Thesaurus:
• Preferred terms: the thesaurus indicates which terms an indexer or searcher is allowed to use; these are the preferred terms, and they are marked as descriptors. Preferred terms serve as focal points where all the information about a concept is collected.
• Non-preferred terms: terms that are not allowed to be used in place of the preferred terms. A non-preferred term is one of two or more synonyms or lexical variants that serves as an entry term. A thesaurus also usually lets you look up a preferred term and see its non-preferred terms, which can give a better idea of what the term is supposed to mean; in the example below, the non-preferred terms are those listed under UF. Non-preferred terms are included in a thesaurus mainly to help users find the appropriate preferred terms, and they may also help to define the scope of preferred terms.
• Semantic relations: links between different terms that explain how their meanings relate. See the description below.
Example:

LANDSLIDES (preferred term)
  SN: a slide of a large mass of dirt and rock down a mountain or hill, etc.
  UF: FALLS (non-preferred terms)
      TOPPLES
      LATERAL SPREADS
      FLOWS
      BLOCK SLIDES
      TALUS FLOWS
  BT: NATURAL HAZARDS
  NT: SILT FLOW SLIDES
      CLAY FLOW SLIDES
      LOESS FLOW SLIDES
      DRY SAND FLOWS
      EARTH FLOWS
      MUD FLOWS
      DEBRIS AVALANCHES
      DEBRIS FLOWS
      DEBRIS FLOODS
      ROCK AVALANCHES
      ROTATIONAL SLIDES
      TRANSLATIONAL SLIDES
  RT: SOLIFLUCTION
      SOIL CREEP
  GLT: BERGRUTSCH
  FLT: ÉBOULEMENT
  ILT: FRANA
  SLT:

Broader terms represent more general concepts, and narrower terms more specific concepts.
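The record structure above can be sketched in code. This is a minimal, illustrative representation, not a standard format: the field names simply mirror the thesaurus labels (SN, UF, BT, NT, RT), and the helper mapping shows how a non-preferred term redirects an indexer to its descriptor.

```python
# Minimal sketch of a thesaurus record; field names mirror the
# standard labels, the class and helper names are illustrative.
from dataclasses import dataclass, field

@dataclass
class ThesaurusEntry:
    term: str                               # preferred term (descriptor)
    sn: str = ""                            # scope note
    uf: list = field(default_factory=list)  # non-preferred terms ("used for")
    bt: list = field(default_factory=list)  # broader terms
    nt: list = field(default_factory=list)  # narrower terms
    rt: list = field(default_factory=list)  # related terms

landslides = ThesaurusEntry(
    term="LANDSLIDES",
    sn="a slide of a large mass of dirt and rock down a mountain or hill, etc.",
    uf=["FALLS", "TOPPLES", "LATERAL SPREADS", "FLOWS", "BLOCK SLIDES", "TALUS FLOWS"],
    bt=["NATURAL HAZARDS"],
    nt=["MUD FLOWS", "DEBRIS FLOWS", "ROCK AVALANCHES"],
    rt=["SOLIFLUCTION", "SOIL CREEP"],
)

# A lookup on a non-preferred term leads to the preferred term:
use_for = {np_term: landslides.term for np_term in landslides.uf}
print(use_for["TOPPLES"])  # -> LANDSLIDES
```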
Hierarchical relationships are based on degrees or levels of superordination and subordination, where the superordinate term represents a class or a whole, and subordinate terms refer to its members or parts. Reciprocity should be expressed by the following relationship indicators:
• BT (Broader Term), a label for the superordinate term
• NT (Narrower Term), a label for the subordinate term

Example 101: Hierarchical relationship notation (BT and NT)
mammals
  BT vertebrates
vertebrates
  NT mammals

Associative relationship: the relationship between two concepts having a non-hierarchical thematic connection based on spatial or temporal proximity, such as the relationship between a container and its contents, an activity and the tool used to perform the activity, a cause and its effect, a producer and its product, an organization and the building in which it is located, etc.
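The reciprocity rule for BT/NT can be sketched as a small check: whenever a term lists B as a broader term, B must list that term as a narrower term, and vice versa. The data structures here are assumptions for illustration, seeded with the mammals/vertebrates example.

```python
# Sketch of the BT/NT reciprocity rule from the example above.
broader = {"mammals": {"vertebrates"}}   # term -> set of broader terms (BT)
narrower = {"vertebrates": {"mammals"}}  # term -> set of narrower terms (NT)

def reciprocal(broader, narrower):
    """True if every BT link has its matching NT link, and vice versa."""
    for term, bts in broader.items():
        for bt in bts:
            if term not in narrower.get(bt, set()):
                return False
    for term, nts in narrower.items():
        for nt in nts:
            if term not in broader.get(nt, set()):
                return False
    return True

print(reciprocal(broader, narrower))  # -> True
```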
Disambiguation in Wikipedia is the process of resolving conflicts in Wikipedia article titles that occur when a single term can be associated with more than one topic, making that term likely to be the natural title for more than one article. In other words, disambiguations are paths leading to different articles which could, in principle, have the same title. For example, the word "Mercury" can refer to several different things, including an element, a planet, an automobile brand, a record label, a NASA manned-spaceflight project, a plant, and a Roman god. Since only one Wikipedia page can have the generic name "Mercury", unambiguous article titles are used for each of these topics: Mercury (element), Mercury (planet), Mercury (automobile), Mercury Records, Project Mercury, Mercury (plant), Mercury (mythology). There must then be a way to direct the reader to the correct specific article when an ambiguous term is referenced by linking, browsing or searching; this is what is known as disambiguation. In this case it is achieved by using Mercury as a disambiguation page.

Vectors
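Conceptually, a disambiguation page behaves like a mapping from an ambiguous title to its candidate articles. This is only a sketch of that idea, not how Wikipedia is actually implemented; the function name is made up for illustration.

```python
# Sketch: an ambiguous title maps to the unambiguous article titles.
disambiguation = {
    "Mercury": [
        "Mercury (element)",
        "Mercury (planet)",
        "Mercury (automobile)",
        "Mercury Records",
        "Project Mercury",
        "Mercury (plant)",
        "Mercury (mythology)",
    ]
}

def resolve(title):
    """Return candidate articles for an ambiguous title; otherwise the title itself."""
    return disambiguation.get(title, [title])

print(resolve("Mercury")[1])  # -> Mercury (planet)
```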
Let the field K be the set R of real numbers, and let the vector space V be the Euclidean space R³. Consider the vectors e1 := (1, 0, 0), e2 := (0, 1, 0) and e3 := (0, 0, 1). Then any vector in R³ is a linear combination of e1, e2 and e3. To see that this is so, take an arbitrary vector (a1, a2, a3) in R³ and write:

(a1, a2, a3) = a1(1, 0, 0) + a2(0, 1, 0) + a3(0, 0, 1) = a1 e1 + a2 e2 + a3 e3.
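The identity above can be checked numerically. A minimal sketch, with the standard basis hard-coded and an illustrative helper name:

```python
# Any (a1, a2, a3) in R^3 equals a1*e1 + a2*e2 + a3*e3
# for the standard basis vectors e1, e2, e3.
e1 = (1, 0, 0)
e2 = (0, 1, 0)
e3 = (0, 0, 1)

def combine(a1, a2, a3):
    """Form the linear combination a1*e1 + a2*e2 + a3*e3 componentwise."""
    return tuple(a1 * x + a2 * y + a3 * z for x, y, z in zip(e1, e2, e3))

print(combine(2.0, -5.0, 0.5))  # -> (2.0, -5.0, 0.5)
```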
A linear classifier is often used in situations where the speed of classification is an issue, since it is often the fastest classifier, especially when the feature vector x is sparse (although decision trees can be faster). Linear classifiers also often work very well when the number of dimensions in x is large, as in document classification, where each element of x is typically the number of occurrences of a word in a document (see document-term matrix). In such cases, the classifier should be well-regularized.
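A sketch of such a classifier on a sparse document vector: only nonzero word counts are stored, so the dot product touches only the words that actually occur. The weights and vocabulary here are invented for illustration, not learned from data.

```python
# Linear classifier over a sparse word-count vector (illustrative weights).
weights = {"good": 1.5, "bad": -2.0, "movie": 0.1}
bias = -0.2

def classify(doc_counts):
    """Score = w . x + b over the sparse counts; positive -> class 1, else 0."""
    score = bias + sum(weights.get(word, 0.0) * count
                       for word, count in doc_counts.items())
    return 1 if score > 0 else 0

print(classify({"good": 2, "movie": 1}))  # 1.5*2 + 0.1 - 0.2 = 2.9 -> 1
print(classify({"bad": 1}))               # -2.0 - 0.2 = -2.2 -> 0
```

Because the sum runs only over the document's own words, classification cost grows with the number of nonzero entries, not with the vocabulary size; this is why linear classifiers stay fast when x is sparse.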
http://www.comp.hkbu.edu.hk/~cib/2008/IIB08Nov/feature_article_%201/IIBWiki.pdf