New results on word-representable graphs

A graph G = (V, E) is word-representable if there exists a word w over the alphabet V such that letters x and y alternate in w if and only if (x, y) ∈ E, for each x ≠ y. The set of word-representable graphs generalizes several important and well-studied graph families, such as circle graphs, comparability graphs, 3-colorable graphs, graphs of vertex degree at most 3, etc. By answering an open question from [9], in the present paper we show that not all graphs of vertex degree at most 4 are word-representable. Combining this result with some previously known facts, we derive that the number of n-vertex word-representable graphs is 2^(n²/3 + o(n²)).
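
To make the alternation condition concrete, here is a minimal Python sketch (not taken from the paper; the word, graph and function names are illustrative) that checks whether a given word represents a given graph:

    from itertools import combinations

    def alternate(word, x, y):
        # Restrict the word to the letters x and y and check the pattern xyxy... or yxyx...
        restriction = [c for c in word if c in (x, y)]
        return all(a != b for a, b in zip(restriction, restriction[1:]))

    def represents(word, vertices, edges):
        # A word represents a graph iff, for every pair of distinct vertices,
        # the pair alternates in the word exactly when it is an edge.
        edge_set = {frozenset(e) for e in edges}
        return all(alternate(word, x, y) == (frozenset((x, y)) in edge_set)
                   for x, y in combinations(vertices, 2))

    # The 4-cycle 1-2-3-4-1 is represented, for instance, by the word 13243142.
    print(represents("13243142", "1234", [("1", "2"), ("2", "3"), ("3", "4"), ("4", "1")]))  # True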

Solving computational problems in the theory of word-representable graphs

we raise some concerns about Conjecture 7, while confirming it for graphs on at most 9 vertices. In Section 3 we present a complementary computational approach using constraint programming, enabling us to count connected non-word-representable graphs. In particular, in Section 3 we report that, using 3 years of CPU time, we found that 64.65% of all connected graphs on 11 vertices are non-word-representable. Another important corollary of our results in Section 3 is the correction of the published result [19, 20] on the number of connected non-word-representable graphs on 9 vertices (see Table 2). In Section 4 we introduce the notion of a k-semi-transitive orientation, refining the notion of a semi-transitive orientation, and show that 3-semi-transitively orientable graphs are not necessarily semi-transitively orientable. Finally, in Section 5 we suggest a few directions for further research and experimentation.

Word Representability of Line Graphs

A graph G = (V, E) is representable if there exists a word W over the alphabet V such that letters x and y alternate in W if and only if (x, y) is in E, for each x not equal to y. The motivation to study representable graphs came from algebra, but this subject is interesting from graph-theoretical, computer science, and combinatorics on words points of view. In this paper, we prove that for n greater than 3, the line graph of the n-wheel is non-representable. This not only provides a new construction of non-representable graphs, but also answers an open question on representability of the line graph of the 5-wheel, the minimal non-representable graph. Moreover, we show that for n greater than 4, the line graph of the complete graph is also non-representable. We then use these facts to prove that, given a graph which is not a cycle, a path or a claw graph, the graph obtained by taking the line graph k times is guaranteed to be non-representable for k greater than 3.
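
As a rough illustration of the objects involved (not the paper's proof), the line graph of the 5-wheel can be assembled with the networkx library; note the assumption that networkx's wheel_graph(n) counts the hub among its n vertices, so the 5-wheel corresponds to wheel_graph(6):

    import networkx as nx

    # The 5-wheel W5: a 5-cycle plus a hub joined to every cycle vertex.
    # networkx counts the hub, so wheel_graph(6) has 6 vertices and 10 edges.
    W5 = nx.wheel_graph(6)

    # Its line graph has one vertex per edge of W5, two of them being adjacent
    # exactly when the corresponding edges of W5 share an endpoint.
    L_W5 = nx.line_graph(W5)

    print(W5.number_of_nodes(), W5.number_of_edges())      # 6 10
    print(L_W5.number_of_nodes(), L_W5.number_of_edges())  # 10 25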

Word-representability of triangulations of grid-covered cylinder graphs

In this paper we extend the results of Akrobotu, Kitaev and Masárová [1] to the case of grid-covered cylinder graphs, which are a cyclic version of rectangular grid graphs; see Subsection 2.2 for definitions. It turns out that in this case, some of the graphs in question with chromatic number 4 are actually word-representable; for example, see the underlying graph in Figure 3.7. Still, assuming that there are at least four sectors in a grid-covered cylinder graph, word-representable triangulations of such graphs are characterized by avoidance of W5 and W7 as induced subgraphs.

Co-occurrence graphs for word sense disambiguation in the biomedical domain

Word Sense Disambiguation is a key step for many Natural Language Processing tasks (e.g. summarization, text classification, relation extraction) and presents a challenge to any system that aims to process documents from the biomedical domain. In this paper, we present a new graph-based unsupervised technique to address this problem. The knowledge base used in this work is a graph built with co-occurrence information from medical concepts found in scientific abstracts, and hence adapted to the specific domain. Unlike other unsupervised approaches based on static graphs such as UMLS, in this work the knowledge base takes the context of the ambiguous terms into account. Abstracts downloaded from PubMed are used for building the graph, and disambiguation is performed using the Personalized PageRank algorithm. Evaluation is carried out over two test datasets widely explored in the literature. Different parameters of the system are also evaluated to test robustness and scalability. Results show that the system is able to outperform state-of-the-art knowledge-based systems, obtaining accuracy improvements of more than 10% in some cases, while requiring only minimal external resources.
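
A hedged sketch of the kind of disambiguation step described, not the authors' exact pipeline: networkx's Personalized PageRank can be biased toward the concepts co-occurring with the ambiguous mention. The toy graph, node names and weights below are invented for illustration:

    import networkx as nx

    # Toy co-occurrence graph over medical concepts (nodes and weights are invented).
    G = nx.Graph()
    G.add_weighted_edges_from([
        ("cold_temperature", "weather", 3.0),
        ("common_cold", "virus", 5.0),
        ("common_cold", "cough", 4.0),
        ("cough", "virus", 2.0),
        ("weather", "virus", 1.0),
    ])

    # Concepts found in the context of the ambiguous mention bias the random walk.
    context = {"cough": 1.0, "virus": 1.0}
    personalization = {n: context.get(n, 0.0) for n in G}

    scores = nx.pagerank(G, alpha=0.85, personalization=personalization, weight="weight")

    # Rank the candidate senses of the mention by their Personalized PageRank score.
    candidates = ["cold_temperature", "common_cold"]
    print(max(candidates, key=scores.get))  # expected: common_cold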

On k-11-representable graphs

The paper is organized as follows. In the rest of the section, we give more details about word-representable graphs. In Section 2, we introduce rigorously the notion of a k-11-representable graph and provide a number of general results on these graphs. In particular, we show that a (k − 1)-11-representable graph is necessarily k-11-representable (see Theorem 2.2). In Section 3, we study the class of 1-11-representable graphs. These studies are extended in Section 4, where we 1-11-represent all non-word-representable graphs on at most 7 vertices. In Section 5 we prove that any graph is 2-11-representable. Finally, in Section 6, we state a number of open problems on k-11-representable graphs.

Word-representability of face subdivisions of triangular grid graphs

Recently, a number of (fundamental) results on word-representable graphs were obtained in the literature; for example, see [1], [3], [5], [7], [9], [11], and [12]. In particular, Halldórsson et al. [7] have shown that a graph is word-representable if and only if it admits a semi-transitive orientation (to be defined in Section 2), which, among other important corollaries, implies that all 3-colorable graphs are word-representable. The theory of word-representable graphs is the main subject of the upcoming book [8].

Can Syntactic and Logical Graphs help Word Sense Disambiguation?

The task of word sense disambiguation (WSD) can be regarded as one of the most important tasks for natural language processing applications, including semantic interpretation of texts, semantic web applications, paraphrasing and summarization. One issue of current word sense disambiguation methods is that the most successful techniques are supervised, which means that annotated corpora must be available to train the systems. However, this kind of data is costly to produce and cannot be created for each new domain to be disambiguated. This indicates that more effort should be put into unsupervised word sense disambiguation techniques. Furthermore, one vital issue that generally has to be solved for such systems is the choice of an adequate context. Usually, this context is defined as a window of words or sentences around the word to be disambiguated. The question raised by this paper is whether defining this context using syntactic and logical features can be beneficial to WSD. This paper briefly presents a natural language processing pipeline that outputs logical representations from texts and disambiguates the logical representations using various WSD algorithms. The paper also presents different context definitions that are used for WSD. Preliminary results show that logical and syntactic features can be of interest to WSD. The main contribution of this paper is the use of syntactic and semantic information for WSD in an unsupervised manner. The paper is organized as follows: first, Section 2 explains the pipeline that creates logical representations and presents the various WSD algorithms and the contexts used in this study. Section 3 presents experiments that are conducted over a small corpus and shows preliminary results. It also describes the results of our system on the Senseval English Lexical Sample task before drawing a conclusion.

Using Graphs for Word Embedding with Enhanced Semantic Relations

In this paper, we present WordGraph2Vec, a word embedding algorithm with semantic enhancement. The algorithm makes use of both linear and graph input in order to strengthen the semantic relations between words. Our experimental results show that the proposed embedding did not achieve the best results on analogy and classification tasks, but it was stable across the datasets and in most cases ranked second in terms of document classification and analogy test accuracy. In future work, further settings of WordGraph2Vec can be explored, such as additional word graph configurations and a larger radius R for the new target words, which should yield target words that are not close to the context word in the original text. In addition, the proposed graph-based approach to word embedding can be evaluated on other NLP tasks in multiple languages.

Semi-transitive orientations and word-representable graphs

Organization of the paper. The paper is organized as follows. In Section 2, we give definitions of objects of interest and review some of the known results. In Section 3, we give a characterization of word-representable graphs in terms of orientations and discuss some important corollaries of this fact. In Section 4, we examine the representation number, and show that it is always at most 2n − 4, but can be as much as n/2. We explore, in Section 5, which classes of graphs are word-representable, and show, in particular, that 3-colorable graphs are such graphs, but numerous other properties are independent from the property of being word-representable. Finally, we conclude with two open problems in Section 6.

Language classification from bilingual word embedding graphs

While multilingual word vectors have been evaluated with respect to intrinsic parameters such as embedding dimensionality, empirical work on another aspect appears to be lacking: the second language involved. For example, it might be the case that projecting two languages with very different lexical semantic associations in a joint embedding space inherently deteriorates monolingual embeddings as measured by performance on an intrinsic monolingual semantic evaluation task, relative to a setting in which the two languages have very similar lexical semantic associations. To illustrate, the classical Latin word vir is sometimes translated in English as both ‘man’ and ‘warrior’, suggesting a semantic connotation, in Latin, that is putatively lacking in English. Hence, projecting English and Latin in a joint semantic space may invoke semantic relations that are misleading for an English evaluation task. Alternatively, it may be argued that heterogeneity in semantics between the two languages involved is beneficial for monolingual evaluation tasks in the same way that uncorrelatedness in classifiers helps in combining them.

The Results on Vertex Domination in Fuzzy Graphs

and complete bipartite fuzzy graphs. Bounds are obtained for the vertex domination number of fuzzy graphs. Also, the relationship between M-strong arcs and α-strong arcs is obtained. In fuzzy graphs, the monotone decreasing property and the monotone increasing property are introduced. We prove that Vizing's conjecture is a monotone decreasing fuzzy graph property for vertex domination. We also prove that the Gravier-Khelladi conjecture is a monotone decreasing fuzzy graph property for it. We obtain Nordhaus-Gaddum (NG) type results for these parameters. The relationship between several classes of operations on fuzzy graphs and their vertex domination numbers is studied.

More results on eccentric coloring in graphs

Unless mentioned otherwise, for terminology and notation the reader may refer to Buckley and Harary [2] and Chartrand and Lesniak [3]; new notions will be introduced as and when found necessary. In this paper we consider simple undirected graphs without multiple edges and self-loops. The order p is the number of vertices in G and the size q is the number of edges in G. The distance d(u, v) between u and v is the length of a shortest path joining u and v. If there exists no path between u and v, then we define d(u, v) = ∞. The eccentricity e(u) of u is the distance to a vertex farthest from u. If d(u, v) = e(u) (v ≠ u), we say that v is an eccentric vertex of u. The radius rad(G) is the minimum eccentricity of the vertices, whereas the diameter diam(G) is the maximum eccentricity. A vertex v is a central vertex if e(v) = rad(G), and the center C(G) is the set of all central vertices. A graph G is self-centered if rad(G) = diam(G). The join of two graphs G1 and G2, defined by Zykov [8], is denoted G1 + G2 and consists of G1 ∪ G2 together with all edges joining each vertex of G1 to each vertex of G2.
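
For experimentation, these distance invariants are available directly in the networkx library (a sketch assuming a small connected simple graph; the example graph is arbitrary):

    import networkx as nx

    # A small connected example: a path 0-1-2-3 with an extra pendant vertex 4 at 1.
    G = nx.Graph([(0, 1), (1, 2), (2, 3), (1, 4)])

    print(nx.eccentricity(G))   # {0: 3, 1: 2, 2: 2, 3: 3, 4: 3}
    print(nx.radius(G))         # rad(G) = minimum eccentricity = 2
    print(nx.diameter(G))       # diam(G) = maximum eccentricity = 3
    print(nx.center(G))         # central vertices: [1, 2]
    # G is self-centered iff rad(G) == diam(G); this example is not.
    print(nx.radius(G) == nx.diameter(G))  # False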

SOME RESULTS ON BIPOLAR FUZZY GRAPHS

[2] M. Akram, Bipolar fuzzy graphs, Information Sciences, DOI 10.1016/j.ins.2011.07.037, 2011.
[3] A. Nagoorgani, K. Radha, On regular fuzzy graphs, Journal of Physical Sciences, Vol. 12, 33-44, 2008.
[4] A. Nagoorgani and J. Malarvizhi, Properties of μ-complement of a fuzzy graph, International Journal of Algorithms, Computing and Mathematics, Vol. 2, No. 3, 73-83, 2009.


Further Results on Sum Cordial Graphs

All graphs G = (V(G), E(G)) in this paper are finite, connected and undirected. For any undefined notation and terminology we follow [3]. If the vertices or edges or both of a graph are assigned values subject to certain conditions, it is known as graph labeling. A dynamic survey on graph labeling is regularly updated by Gallian [4]. Labeled graphs have a variety of applications in graph theory, particularly for missile guidance codes, the design of good radar-type codes, and convolution codes with optimal autocorrelation properties. Labeled graphs play a vital role in the study of X-ray crystallography, communication networks, and in determining optimal circuit layouts. A detailed study of a variety of applications of graph labeling is carried out in Bloom and Golomb [1].

p-representable operators in Banach spaces

Let L(E, F) be the space of all bounded linear operators from E into F, and B_E* the unit ball of E*, the dual of E. The completion of the injective tensor product of E and F is denoted by E ⊗ F.


The equational theories of representable residuated semigroups

Residuated algebras and their equational theories have been investigated in their own right and also in connection with substructural logics. The reason for the latter is that the algebraizations of substructural logics like relevance logic [AB75, ABD92] and the Lambek calculus (LC) [La58] yield residuated algebras. Indeed, for these logics, the Lindenbaum–Tarski algebras are residuated algebras, and sound relational semantics can be provided using families of binary relations, i.e., representable residuated algebras. These connections are explained in detail in [Mik??] and the references therein. In particular, we show in [Mik??] completeness of an expansion of LC with meet w.r.t. binary relational semantics. This completeness result states that derivability in LC augmented with derivation rules for meet coincides with semantic validity, i.e., completeness is stated in its weak form and does not capture general semantic consequence. The proof uses cut-elimination. In algebraic terms, this result means that the equational theories of abstract (related to the syntactic calculus) and representable (related to binary semantics) algebras coincide. In other words, the free abstract algebra is representable.

Relation Prediction for Unseen Entities Using Entity Word Graphs

the Entity-Word graph, even Unseen-entities can explicitly be connected with other entities. Our method encodes the graph with Graph Convolutional Networks (GCNs) (Kipf and Welling, 2017) to learn entity representations considering the global features of the entire graph. GCNs simplify the convolutional operations on the graph, and learn node representations based on their neighborhood information. GCNs are utilized for several NLP tasks (Zhang et al., 2018; De Cao et al., 2019). By encoding the Entity-Word graph with GCNs, not only the description information but also information about the related entities is propagated to the Unseen-entities through words. We expect that the entity representations learned via our Entity-Word graph can contribute to improving the performance of KGC.
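
As a minimal sketch of the propagation rule of Kipf and Welling (2017), not of the paper's actual model, one GCN layer can be written in a few lines of numpy; the toy adjacency matrix and dimensions are invented:

    import numpy as np

    def gcn_layer(adjacency, features, weights):
        # One propagation step: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).
        a_hat = adjacency + np.eye(adjacency.shape[0])               # add self-loops
        d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
        a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]   # symmetric normalization
        return np.maximum(a_norm @ features @ weights, 0.0)          # ReLU

    # Toy 4-node graph (say, two entities and two description words), invented adjacency.
    A = np.array([[0, 1, 1, 0],
                  [1, 0, 0, 1],
                  [1, 0, 0, 1],
                  [0, 1, 1, 0]], dtype=float)
    rng = np.random.default_rng(0)
    H0 = rng.normal(size=(4, 8))       # initial node features
    W1 = rng.normal(size=(8, 16))      # layer weights
    print(gcn_layer(A, H0, W1).shape)  # (4, 16)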

Word Sense Induction & Disambiguation Using Hierarchical Random Graphs

Another graph-based method is presented in (Dorow and Widdows, 2003). They extract only noun neighbours that appear in conjunctions or disjunctions with the target word. Additionally, they extract second-order co-occurrences. Nouns are represented as vertices, while edges between vertices are drawn if their associated nouns co-occur in conjunctions or disjunctions more than a given number of times. This co-occurrence frequency is also used to weight the edges. The resulting graph is then pruned by removing the target word and vertices with a low degree. Finally, the MCL algorithm (Dongen, 2000) is used to cluster the graph and produce a set of clusters (senses), each one consisting of a set of contextually related words.
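
A small Python sketch of this pipeline up to the clustering step, with invented co-occurrence counts and thresholds (the MCL step is left as a comment, since the choice of implementation is open):

    import networkx as nx
    from collections import Counter

    # Invented counts: how often each noun pair co-occurs in a conjunction or
    # disjunction involving the ambiguous target word "bank".
    cooccurrence = Counter({
        ("bank", "lender"): 7, ("bank", "credit"): 5, ("lender", "credit"): 4,
        ("lender", "loan"): 3, ("credit", "loan"): 3, ("bank", "teller"): 1,
        ("bank", "river"): 6, ("bank", "shore"): 4, ("river", "shore"): 5,
        ("river", "water"): 4, ("shore", "water"): 3,
    })
    MIN_COUNT, MIN_DEGREE = 2, 2

    # Nouns become vertices; edges are weighted by co-occurrence frequency above a threshold.
    G = nx.Graph()
    G.add_weighted_edges_from((u, v, c) for (u, v), c in cooccurrence.items() if c >= MIN_COUNT)

    # Prune the graph: drop the target word itself and low-degree vertices.
    G.remove_node("bank")
    G.remove_nodes_from([n for n, d in dict(G.degree()).items() if d < MIN_DEGREE])

    # A clustering algorithm such as MCL would now be run on the pruned graph;
    # here the connected components already separate the two senses.
    print([sorted(c) for c in nx.connected_components(G)])
    # e.g. [['credit', 'lender', 'loan'], ['river', 'shore', 'water']]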