Elevator Pitch
Navigate the deep waters of knowledge graphs with an intuitive step-by-step UI, and find answers to your connected-data questions in a few guided clicks. Traverse relationships and filter items along the way to reach the relevant data. Set sail from any anchor and steer like an experienced sailor.
Description
In this talk we wish to demonstrate our ideas for a new UI component for traversing and filtering Neo4j graphs in a visual, schema-assisted way, starting from any anchor node or label. This prototype is the outcome of our research into letting end users intuitively find their way through the intricate forest of highly connected knowledge graphs. Our approach focuses on allowing the user to easily find answers to complex connected-data questions in a dynamic multi-step wizard that mirrors the way such questions are naturally thought through. Guided by the underlying knowledge-graph schema, the user builds a traversal query step by step, moving from a starting node or label back and forth across adjacent relationships to the next connected nodes, with optional attribute-based filtering and other options at each step. As a result, in a few clicks, users find their answers and can expand the relevant data inside the main graph visualisation for further exploration.
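As a concrete illustration, a traversal built in a couple of guided steps could translate into a Cypher query along these lines (the Person, Company and City labels and relationships are purely hypothetical, chosen for the example rather than taken from the prototype's schema):

// anchor: Person nodes; first step across WORKS_AT, with an in-step attribute filter on the company
MATCH (p:Person)-[:WORKS_AT]->(c:Company)
WHERE c.industry = 'Software'
// second step: continue from the filtered companies to their connected City nodes
MATCH (c)-[:LOCATED_IN]->(city:City)
RETURN p.name AS person, c.name AS company, city.name AS city
LIMIT 25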
Notes
The presentation will mostly focus on the demonstration of a working prototype, and we will show a few Cypher queries. The talk is well suited for beginners in graphs; it requires only a general understanding of the connected-data challenge, which we assume is shared among all participants at the Nodes conference. It is core to our mission at GraphAware to enable the adoption of graphs, and of Neo4j in particular. The solution we present here is in line with that mission and is an important step towards enabling users to benefit from graph technology without specialist skills, in this case to traverse and query graphs. We know this theme is of interest to the broad audience of Nodes 2021 developers and their users, and we look forward to the community's feedback and questions.