# LinkedDataFragments/Server.js

A Triple Pattern Fragments server for Node.js.
This repository contains modules for Linked Data Fragments (LDF) servers.
Find more information about migrating from `ldf-server` 2.x.x on our wiki.
On today's Web, Linked Data is published in different ways, including data dumps, subject pages, and results of SPARQL queries. We call each such part a **Linked Data Fragment**.

The issue with the current Linked Data Fragments is that they are either so powerful that their servers suffer from low availability rates (as is the case with SPARQL), or they do not allow efficient querying.

Instead, this server offers **Quad Pattern Fragments** (a.k.a. **Triple Pattern Fragments**). Each Quad Pattern Fragment offers:
- **data** that corresponds to a quad/triple pattern (example);
- **metadata** that consists of the (approximate) total triple count (example);
- **controls** that lead to all other fragments of the same dataset (example).
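Concretely, a client selects a fragment by filling in a triple pattern in a URL template that the server advertises through its hypermedia controls. The sketch below builds such a fragment URL, assuming the default `subject`/`predicate`/`object` parameter names; a well-behaved client would read the actual parameter names from the controls instead of hard-coding them.

```javascript
// Sketch: build the URL of a Quad/Triple Pattern Fragment from a pattern.
// The parameter names below are the common defaults advertised by QPF
// servers; clients should in principle discover them via the controls.
function fragmentUrl(datasourceUrl, { subject, predicate, object } = {}) {
  const url = new URL(datasourceUrl);
  // Unbound positions are simply omitted and act as wildcards.
  if (subject)   url.searchParams.set('subject', subject);
  if (predicate) url.searchParams.set('predicate', predicate);
  if (object)    url.searchParams.set('object', object);
  return url.toString();
}

// Example: all triples with predicate foaf:name in a 'dbpedia' datasource.
console.log(fragmentUrl('http://localhost:5000/dbpedia', {
  predicate: 'http://xmlns.com/foaf/0.1/name',
}));
```

Requesting such a URL returns one page of matching triples, plus the metadata and controls described above.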
An example server is available at data.linkeddatafragments.org.
The easiest way to start using this server is via `@ldf/server` (previously known as `ldf-server`).
This server requires Node.js 10.0 or higher and is tested on OSX and Linux. To install, execute:

```shell
$ [sudo] npm install -g @ldf/server
```
First, create a configuration file `config.json` similar to `config/config-example.json`, in which you detail your data sources. For example, this configuration uses an HDT file and a SPARQL endpoint as sources:

```json
{
  "@context": "https://linkedsoftwaredependencies.org/bundles/npm/@ldf/server/^3.0.0/components/context.jsonld",
  "@id": "urn:ldf-server:my",
  "import": "preset-qpf:config-defaults.json",

  "title": "My Linked Data Fragments server",

  "datasources": [
    {
      "@id": "ex:myHdtDatasource",
      "@type": "HdtDatasource",
      "datasourceTitle": "DBpedia 2014",
      "description": "DBpedia 2014 with an HDT back-end",
      "datasourcePath": "dbpedia",
      "hdtFile": "data/dbpedia2014.hdt"
    },
    {
      "@id": "ex:mySparqlDatasource",
      "@type": "SparqlDatasource",
      "datasourceTitle": "DBpedia (Virtuoso)",
      "description": "DBpedia with a Virtuoso back-end",
      "datasourcePath": "dbpedia-sparql",
      "sparqlEndpoint": "https://dbpedia.org/sparql"
    }
  ]
}
```
More details on how to configure this server can be found in the README of `@ldf/server`.
After creating a configuration file, execute:

```shell
$ ldf-server config.json 5000 4
```

Here, `5000` is the HTTP port on which the server will listen, and `4` the number of worker processes.

Now visit `http://localhost:5000/` in your browser.
This repository should be used by LDF Server module developers, as it contains multiple LDF Server modules that can be composed. We manage this repository as a monorepo using Lerna.
The following modules are available:
- `@ldf/core`: Shared functionality for LDF servers.
- `@ldf/server`: An LDF server with Quad/Triple Pattern Fragments support.
- `@ldf/preset-qpf`: Configuration presets for Quad/Triple Pattern Fragments servers.
- `@ldf/feature-qpf`: Feature that enables Quad Pattern Fragments (a.k.a. Triple Pattern Fragments).
- `@ldf/feature-summary`: Feature that adds summaries to datasources.
- `@ldf/feature-memento`: Feature that enables datetime negotiation using the Memento Protocol.
- `@ldf/feature-webid`: Feature that enables authenticated requests from clients with a WebID.
- `@ldf/datasource-hdt`: Datasource that allows HDT files to be loaded.
- `@ldf/datasource-jsonld`: Datasource that allows JSON-LD files to be loaded.
- `@ldf/datasource-rdfa`: Datasource that allows RDFa files to be loaded.
- `@ldf/datasource-n3`: Datasource that allows N-Quads, N-Triples, TriG and Turtle files to be loaded.
- `@ldf/datasource-sparql`: Datasource that allows SPARQL endpoints to be used as a data proxy.
- `@ldf/datasource-composite`: Datasource that delegates queries to a sequence of other datasources.
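To illustrate the delegation idea behind a composite datasource: it answers a quad-pattern query by asking each underlying datasource in sequence and merging the answers. The sketch below is purely conceptual; the class and method names are illustrative assumptions, not the actual `@ldf/datasource-composite` interfaces.

```javascript
// Conceptual sketch of a composite datasource: it delegates a quad-pattern
// query to a sequence of underlying datasources and concatenates the
// results. Names here are illustrative, not the real @ldf/* API.
class CompositeDatasource {
  constructor(datasources) {
    this.datasources = datasources;
  }

  async select(pattern) {
    const results = [];
    for (const datasource of this.datasources) {
      // Each underlying datasource answers the same pattern;
      // the composite merges the answers in sequence order.
      results.push(...await datasource.select(pattern));
    }
    return results;
  }
}

// Usage with two stub datasources that each return canned triples:
const stub = (triples) => ({ select: async () => triples });
const composite = new CompositeDatasource([
  stub([{ s: 'ex:a', p: 'ex:p', o: '"1"' }]),
  stub([{ s: 'ex:b', p: 'ex:p', o: '"2"' }]),
]);
composite.select({ predicate: 'ex:p' })
  .then((quads) => console.log(quads.length)); // 2
```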
These modules can be used to configure your own LDF server with the features you want. As an example of how to compose such a server, have a look at `@ldf/server`, which is a default server configuration with all possible features enabled.
If you want to develop new features or use the (potentially unstable) in-development version, you can set up a development environment for this server.

LDF Server requires Node.js 10.0 or higher and the Yarn package manager. LDF Server is tested on OSX, Linux and Windows.
This project can be set up by cloning and installing it as follows:

```shell
$ git clone https://github.com/LinkedDataFragments/Server.js.git
$ cd Server.js
$ yarn install
```
Note: `npm install` is not supported at the moment, as this project makes use of Yarn's workspaces functionality.
This will install the dependencies of all modules and bootstrap the Lerna monorepo. After that, all LDF Server packages are available in the `packages/` folder and can be used in a development environment.
Furthermore, this will add pre-commit hooks to build, lint and test. These hooks can temporarily be disabled at your own risk by adding the `-n` flag to the commit command.
The Linked Data Fragments server is written by Ruben Verborgh, Miel Vander Sande, Ruben Taelman and colleagues.

This code is copyrighted by Ghent University – imec and released under the MIT license.