sparqlwrapper

A wrapper for a remote SPARQL endpoint

https://github.com/rdflib/sparqlwrapper

Science Score: 26.0%

This score indicates how likely this project is to be science-related based on various indicators:

  • CITATION.cff file
  • codemeta.json file
    Found codemeta.json file
  • .zenodo.json file
    Found .zenodo.json file
  • DOI references
  • Academic publication links
  • Committers with academic emails
  • Institutional organization owner
  • JOSS paper metadata
  • Scientific vocabulary similarity
    Low similarity (14.1%) to scientific vocabulary

Keywords

pypi python rdf sparql sparql-endpoints sparql-query wrapper

Keywords from Contributors

linked-data semantic-web rdflib uri turtle-rdf turtle serializer rdf-xml ntriples nquads
Last synced: 6 months ago

Repository

A wrapper for a remote SPARQL endpoint

Basic Info
Statistics
  • Stars: 548
  • Watchers: 34
  • Forks: 125
  • Open Issues: 65
  • Releases: 19
Topics
pypi python rdf sparql sparql-endpoints sparql-query wrapper
Created over 12 years ago · Last pushed 10 months ago
Metadata Files
Readme Changelog License Authors

README.rst

.. image:: docs/source/SPARQLWrapper-250.png

=======================================
SPARQL Endpoint interface to Python
=======================================

|Build Status| |PyPi version|

* About_
* `Installation & Distribution`_
* `How to use`_
* `SPARQL Endpoint Implementations`_
* `Development`_


About
=====

**SPARQLWrapper** is a simple Python wrapper around a SPARQL service to
remotely execute your queries. It helps by creating the query
invocation and, optionally, converting the result into more manageable
formats.

Installation & Distribution
===========================

You can install SPARQLWrapper from PyPI::

   $ pip install sparqlwrapper

You can install SPARQLWrapper from GitHub::

   $ pip install git+https://github.com/rdflib/sparqlwrapper#egg=sparqlwrapper

You can install SPARQLWrapper from Debian::

   $ sudo apt-get install python-sparqlwrapper

.. note::

   Be aware that there could be a gap between the latest version of SPARQLWrapper
   and the version available as Debian package.

Also, the source code of the package can be downloaded
in ``.zip`` and ``.tar.gz`` formats from the `GitHub SPARQLWrapper releases <https://github.com/RDFLib/sparqlwrapper/releases>`_ page.
Documentation is included in the distribution.


How to use
==========

You can use SPARQLWrapper either as a Python command line script or as a Python package.

Command Line Script
-------------------

To use as a command line script, you will need to install SPARQLWrapper; a
command line script called ``rqw`` (spaRQl Wrapper) will then be available within the
Python environment into which it is installed. Run ``$ rqw -h`` to see all the
script's options.

Python package
--------------

Here are a series of examples of different queries executed via SPARQLWrapper
as a python package.

SELECT examples
^^^^^^^^^^^^^^^

Simple use of this module is as follows where a live SPARQL endpoint is given and the JSON return format is used:

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper(
        "http://vocabs.ardc.edu.au/repository/api/sparql/"
        "csiro_international-chronostratigraphic-chart_geologic-time-scale-2020"
    )
    sparql.setReturnFormat(JSON)

    # gets the first 3 geological ages
    # from a Geological Timescale database,
    # via a SPARQL endpoint
    sparql.setQuery("""
        PREFIX gts: <http://resource.geosciml.org/ontology/timescale/gts#>

        SELECT *
        WHERE {
            ?a a gts:Age .
        }
        ORDER BY ?a
        LIMIT 3
        """
    )

    try:
        ret = sparql.queryAndConvert()

        for r in ret["results"]["bindings"]:
            print(r)
    except Exception as e:
        print(e)


This should print out something like this::

    {'a': {'type': 'uri', 'value': 'http://resource.geosciml.org/classifier/ics/ischart/Aalenian'}}
    {'a': {'type': 'uri', 'value': 'http://resource.geosciml.org/classifier/ics/ischart/Aeronian'}}
    {'a': {'type': 'uri', 'value': 'http://resource.geosciml.org/classifier/ics/ischart/Albian'}}


The above result is the response from the given endpoint, retrieved in JSON, and converted to a
Python object, ``ret``, which is then iterated over and printed.
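
Each entry in ``ret["results"]["bindings"]`` is a dictionary keyed by variable
name, and each value carries ``type`` and ``value`` fields, so individual IRIs
can be pulled out directly. A minimal sketch, continuing the example above:

.. code-block:: python

    # each row maps variable names to {"type": ..., "value": ...} dicts,
    # so the bare IRIs can be extracted like this:
    ages = [r["a"]["value"] for r in ret["results"]["bindings"]]
    for iri in ages:
        print(iri)  # e.g. http://resource.geosciml.org/classifier/ics/ischart/Aalenian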

ASK example
^^^^^^^^^^^

This query gets a boolean response from DBpedia's SPARQL endpoint:

.. code-block:: python

   from SPARQLWrapper import SPARQLWrapper, XML

   sparql = SPARQLWrapper("http://dbpedia.org/sparql")
   sparql.setQuery("""
       ASK WHERE {
            rdfs:label "Asturias"@es
       }
   """)
   sparql.setReturnFormat(XML)
   results = sparql.query().convert()
   print(results.toxml())


You should see something like:

.. code-block::

    <?xml version="1.0" ?>
    <sparql xmlns="http://www.w3.org/2005/sparql-results#">
        <head/>
        <boolean>true</boolean>
    </sparql>


CONSTRUCT example
^^^^^^^^^^^^^^^^^

CONSTRUCT queries return RDF, so ``queryAndConvert()`` here produces an
RDFlib ``Graph`` object which is then serialized to the Turtle format
for printing:

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper

    sparql = SPARQLWrapper("http://dbpedia.org/sparql")

    sparql.setQuery("""
        PREFIX dbo: <http://dbpedia.org/ontology/>
        PREFIX sdo: <https://schema.org/>

        CONSTRUCT {
          ?lang a sdo:Language ;
          sdo:alternateName ?iso6391Code .
        }
        WHERE {
          ?lang a dbo:Language ;
          dbo:iso6391Code ?iso6391Code .
          FILTER (STRLEN(?iso6391Code)=2) # to filter out non-valid values
        }
        LIMIT 3
    """)

    results = sparql.queryAndConvert()
    print(results.serialize())


Results from this query should look something like this:

.. code-block::

    @prefix schema: <https://schema.org/> .

    <http://dbpedia.org/resource/...> a schema:Language ;
        schema:alternateName "ar" .

    <http://dbpedia.org/resource/...> a schema:Language ;
        schema:alternateName "an" .

    <http://dbpedia.org/resource/...> a schema:Language ;
        schema:alternateName "es" .


DESCRIBE example
^^^^^^^^^^^^^^^^

Like CONSTRUCT queries, DESCRIBE queries also produce RDF results, so this
example produces an RDFlib ``Graph`` object which is then serialized into
the JSON-LD format and printed:

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper

    sparql = SPARQLWrapper("http://dbpedia.org/sparql")
    sparql.setQuery("DESCRIBE ")

    results = sparql.queryAndConvert()
    print(results.serialize(format="json-ld"))


The result for this example is large but starts something like this:

.. code-block::

    [
        {
            "@id": "http://dbpedia.org/resource/Mazonovo",
            "http://dbpedia.org/ontology/subdivision": [
                {
                    "@id": "http://dbpedia.org/resource/Asturias"
                }
            ],
    ...

SPARQL UPDATE example
^^^^^^^^^^^^^^^^^^^^^

UPDATE queries write changes to a SPARQL endpoint, so we can't easily show
a working example here. However, if ``https://example.org/sparql`` really
was a working SPARQL endpoint that allowed updates, the following code
might work:

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper, POST, DIGEST

    sparql = SPARQLWrapper("https://example.org/sparql")
    sparql.setHTTPAuth(DIGEST)
    sparql.setCredentials("some-login", "some-password")
    sparql.setMethod(POST)

    sparql.setQuery("""
        PREFIX dbp:  
        PREFIX rdfs: 

        WITH 
        DELETE {
           dbo:Asturias rdfs:label "Asturies"@ast
        }
        """
    )

    results = sparql.query()
    print results.response.read()


If the above code really worked, it would delete the triple
``dbp:Asturias rdfs:label "Asturies"@ast`` from the graph
``http://example.graph``.


SPARQLWrapper2 example
^^^^^^^^^^^^^^^^^^^^^^

There is also a ``SPARQLWrapper2`` class that works with JSON SELECT
results only and wraps the result rows to make processing of typical queries
even simpler.

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper2

    sparql = SPARQLWrapper2("http://dbpedia.org/sparql")
    sparql.setQuery("""
        PREFIX dbp: <http://dbpedia.org/resource/>
        PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

        SELECT ?label
        WHERE {
            dbp:Asturias rdfs:label ?label
        }
        LIMIT 3
        """
                    )

    for result in sparql.query().bindings:
        print(f"{result['label'].lang}, {result['label'].value}")

The above should print out something like:

.. code-block::

    en, Asturias
    ar, أشتورية
    ca, Astúries


Return formats
--------------

The expected return format differs per query type (``SELECT``, ``ASK``, ``CONSTRUCT``, ``DESCRIBE``...).

.. note:: From the `SPARQL specification <https://www.w3.org/TR/sparql11-protocol/>`_,
  *The response body of a successful query operation with a 2XX response is either:*

  * ``SELECT`` and ``ASK``: a SPARQL Results Document in XML, JSON, or CSV/TSV format.
  * ``DESCRIBE`` and ``CONSTRUCT``: an RDF graph serialized, for example, in the RDF/XML syntax, or an equivalent RDF graph serialization.

The package, though it does not contain a full SPARQL parser, makes an attempt to determine the query type
when the query is set. This works in most cases, but the query type can also be set manually, in case the
detection goes wrong.
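
As a rough illustration, the return format is usually chosen to match the query
type; a minimal sketch (DBpedia is used purely as an example endpoint):

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper, JSON, TURTLE

    # SELECT queries are usually paired with a SPARQL Results format such as JSON...
    select_query = SPARQLWrapper("http://dbpedia.org/sparql")
    select_query.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 1")
    select_query.setReturnFormat(JSON)      # parsed into a Python dictionary

    # ...while CONSTRUCT and DESCRIBE queries return RDF, e.g. Turtle.
    construct_query = SPARQLWrapper("http://dbpedia.org/sparql")
    construct_query.setQuery("CONSTRUCT WHERE { ?s ?p ?o } LIMIT 1")
    construct_query.setReturnFormat(TURTLE)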

Automatic conversion of the results
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To make processing somewhat easier, the package can do some conversions automatically from the return result. These are:

* for XML, the `xml.dom.minidom <https://docs.python.org/3/library/xml.dom.minidom.html>`_ module is used to convert the result stream into a Python representation of a DOM tree.
* for JSON, the `json <https://docs.python.org/3/library/json.html>`_ package is used to generate a Python dictionary.
* for CSV or TSV, a simple ``string``.
* For RDF/XML and JSON-LD, the `RDFLib <https://github.com/RDFLib/rdflib>`_ package is used to convert the result into a ``Graph`` instance.
* For RDF Turtle/N3, a simple ``string``.


There are two ways to generate this conversion:

* call ``ret.convert()`` on the return value of ``sparql.query()``, as in the code above
* call ``sparql.queryAndConvert()`` to get the converted result right away, if the intermediate stream is not needed


For example, in the code below:

.. code-block:: python

    try:
        sparql.setReturnFormat(SPARQLWrapper.JSON)
        ret = sparql.query()
        d = ret.convert()
    except Exception as e:
        print(e)


the value of ``d`` is a Python dictionary of the query result, based on the `SPARQL Query Results JSON Format <https://www.w3.org/TR/sparql11-results-json/>`_.
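
Given that structure, the variable names and result rows can be read straight
off the dictionary; a minimal sketch:

.. code-block:: python

    # d follows the SPARQL 1.1 Query Results JSON layout for SELECT queries:
    # {"head": {"vars": [...]}, "results": {"bindings": [...]}}
    print(d["head"]["vars"])
    for row in d["results"]["bindings"]:
        # each row maps a variable name to a {"type": ..., "value": ...} dict
        print({var: row[var]["value"] for var in row})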


Partial interpretation of the results
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

A further help is an extra, partial interpretation of the results, again covering
most of the practical use cases.
Based on the `SPARQL Query Results JSON Format <https://www.w3.org/TR/sparql11-results-json/>`_, the :class:`SPARQLWrapper.SmartWrapper.Bindings` class
can perform some simple steps in decoding the JSON return results. If :class:`SPARQLWrapper.SmartWrapper.SPARQLWrapper2`
is used instead of :class:`SPARQLWrapper.Wrapper.SPARQLWrapper`, this result format is generated. Note that this relies on the JSON format only,
i.e., you have to check whether the SPARQL service can return JSON.

Here is a simple piece of code that makes use of this feature:

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper2

    sparql = SPARQLWrapper2("http://example.org/sparql")
    sparql.setQuery("""
        SELECT ?subj ?prop
        WHERE {
            ?subj ?prop ?obj
        }
        """
    )

    try:
        ret = sparql.query()
        print(ret.variables)  # this is an array consisting of "subj" and "prop"
        for binding in ret.bindings:
            # each binding is a dictionary. Let us just print the results
            print(f"{binding['subj'].value}, {binding['subj'].type}")
            print(f"{binding['prop'].value}, {binding['prop'].type}")
    except Exception as e:
        print(e)


To make this type of code even easier to write, the ``[]`` and ``in`` operators are also implemented
on the result of :class:`SPARQLWrapper.SmartWrapper.Bindings`. They can be used to check for and find a particular binding (i.e., a particular row
in the return value). This feature becomes particularly useful when the ``OPTIONAL`` feature of SPARQL is used. For example:

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper2

    sparql = SPARQLWrapper2("http://example.org/sparql")
    sparql.setQuery("""
        SELECT ?subj ?obj ?opt
        WHERE {
            ?subj  ?obj .
            OPTIONAL {
                ?subj  ?opt
            }
        }
        """
    )

    try:
        ret = sparql.query()
        print(ret.variables)  # this is an array consisting of "subj", "obj", "opt"
        if ("subj", "prop", "opt") in ret:
            # there is at least one binding covering the optional "opt", too
            bindings = ret["subj", "obj", "opt"]
            # bindings is an array of dictionaries with the full bindings
            for b in bindings:
                subj = b["subj"].value
                o = b["obj"].value
                opt = b["opt"].value
                # do something nice with subj, o, and opt

        # another way of accessing values for a single variable:
        # take all the bindings of the "subj"
        subjbind = ret.getValues("subj")  # an array of Value instances
        ...
    except Exception as e:
        print(e)


GET or POST
^^^^^^^^^^^

By default, all SPARQL services are invoked using the HTTP **GET** verb. However,
**POST** might be useful if the size of the query
exceeds a reasonable length; this can be set on the query instance.

Note that some combinations may not work yet with all SPARQL processors
(e.g., there are implementations where **POST + JSON return** does not work).
Hopefully, this problem will eventually disappear.
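
Switching to POST is a one-line change; a minimal sketch (the endpoint URL is
only an illustration):

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper, POST, JSON

    sparql = SPARQLWrapper("http://example.org/sparql")  # illustrative endpoint
    sparql.setMethod(POST)        # send the query in the request body rather than the URL
    sparql.setReturnFormat(JSON)
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 10")
    results = sparql.queryAndConvert()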


SPARQL Endpoint Implementations
===============================

Introduction
------------

From the `SPARQL 1.1 Protocol specification <https://www.w3.org/TR/sparql11-protocol/>`_:

The response body of a successful query operation with a 2XX response is either:

- `SELECT` and `ASK`: a SPARQL Results Document in XML, JSON, or CSV/TSV format.
- `DESCRIBE` and `CONSTRUCT`: an **RDF graph serialized**, for example, in the RDF/XML syntax, or an equivalent RDF graph serialization.

The fact is that the **parameter key** used to choose the **output format** is not standardized.
Virtuoso uses `format`, Fuseki uses `output`, Rasqal seems to use `results`, etc.
Also, in some cases HTTP content negotiation can/must be used.
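
When an endpoint needs such an explicit output-format parameter, it can be
attached with ``addParameter``. A minimal sketch, using Fuseki's ``output``
key as described below (the endpoint URL is only an illustration):

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper, JSON

    sparql = SPARQLWrapper("http://example.org/dataset/sparql")  # illustrative endpoint
    sparql.setQuery("SELECT * WHERE { ?s ?p ?o } LIMIT 5")
    sparql.setReturnFormat(JSON)           # content negotiation via the Accept header
    sparql.addParameter("output", "json")  # extra key/value pair sent with the request
    results = sparql.queryAndConvert()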


ClioPatria
----------

:Website: The SWI-Prolog Semantic Web Server (ClioPatria).
:Documentation: Search 'sparql' in the ClioPatria documentation.
:Uses: Parameters **and** Content Negotiation.
:Parameter key: ``format``.
:Parameter value: MUST be one of these values: ``rdf+xml``, ``json``, ``csv``, ``application/sparql-results+xml`` or ``application/sparql-results+json``.


OpenLink Virtuoso
-----------------
:Website: OpenLink Virtuoso
:Parameter key: ``format`` or ``output``.
:JSON-LD (application/ld+json): supported (in CONSTRUCT and DESCRIBE).

- Parameter value, given directly as a MIME type: "text/html" (HTML), "text/x-html+tr" (HTML (Faceted Browsing Links)), "application/vnd.ms-excel",
  "application/sparql-results+xml" (XML), "application/sparql-results+json" (JSON), "application/javascript" (Javascript), "text/turtle" (Turtle), "application/rdf+xml" (RDF/XML),
  "text/plain" (N-Triples), "text/csv" (CSV), "text/tab-separated-values" (TSV)
- Parameter value, given indirectly as an alias:
  "HTML" (alias for text/html), "JSON" (alias for application/sparql-results+json), "XML" (alias for application/sparql-results+xml), "TURTLE" (alias for text/rdf+n3), "JavaScript" (alias for application/javascript).
  See the Virtuoso documentation.

- For ``SELECT`` queries, the default return mimetype (if ``Accept: */*`` is sent) is ``application/sparql-results+xml``
- For ``ASK`` queries, the default return mimetype (if ``Accept: */*`` is sent) is ``text/html``
- For ``CONSTRUCT`` queries, the default return mimetype (if ``Accept: */*`` is sent) is ``text/turtle``
- For ``DESCRIBE`` queries, the default return mimetype (if ``Accept: */*`` is sent) is ``text/turtle``


Fuseki
------
:Website: Fuseki (Apache Jena)
:Uses: Parameters **and** Content Negotiation.
:Parameter key: ``format`` or ``output`` (Fuseki 1, Fuseki 2).
:JSON-LD (application/ld+json): supported (in CONSTRUCT and DESCRIBE).

- Fuseki 1 short names for ``output=``: "json", "xml", "sparql", "text", "csv", "tsv", "thrift"
- Fuseki 2 short names for ``output=``: "json", "xml", "sparql", "text", "csv", "tsv", "thrift"
- If an unexpected short name is used, the server returns "Error 400: Can't determine output serialization"
- Valid aliases for SELECT and ASK: "json", "xml", "csv", "tsv"
- Valid aliases for DESCRIBE and CONSTRUCT: "json" (alias for json-ld ONLY in Fuseki 2), "xml"
- Valid mimetype for DESCRIBE and CONSTRUCT: "application/ld+json"
- Default return mimetypes: for SELECT and ASK query types, the default return mimetype (if Accept: */* is sent) is application/sparql-results+json
- Default return mimetypes: for DESCRIBE and CONSTRUCT query types, the default return mimetype (if Accept: */* is sent) is text/turtle
- In case of a badly formed query, Fuseki 1 returns 200 instead of 400.


Eclipse RDF4J
-------------
:Website: Eclipse RDF4J (formerly known as OpenRDF Sesame)
:Documentation: See the RDF4J server REST API documentation.
:Uses: Only content negotiation (no URL parameters).
:Parameter: If an unexpected parameter is used, the server ignores it.
:JSON-LD (application/ld+json): supported (in CONSTRUCT and DESCRIBE).

- SELECT

  - ``application/sparql-results+xml`` (DEFAULT if ``Accept: */*`` is sent)
  - ``application/sparql-results+json`` (also ``application/json``)
  - ``text/csv``
  - ``text/tab-separated-values``
  - Other values: ``application/x-binary-rdf-results-table``

- ASK

  - ``application/sparql-results+xml`` (DEFAULT if ``Accept: */*`` is sent)
  - ``application/sparql-results+json``
  - Other values: ``text/boolean``
  - **Not supported**: ``text/csv``
  - **Not supported**: ``text/tab-separated-values``

- CONSTRUCT

  - ``application/rdf+xml``
  - ``application/n-triples`` (DEFAULT if ``Accept: */*`` is sent)
  - ``text/turtle``
  - ``text/n3``
  - ``application/ld+json``
  - Other acceptable values: ``application/n-quads``, ``application/rdf+json``, ``application/trig``, ``application/trix``, ``application/x-binary-rdf``
  - ``text/plain`` (returns ``application/n-triples``)
  - ``text/rdf+n3`` (returns ``text/n3``)
  - ``text/x-nquads`` (returns ``application/n-quads``)

- DESCRIBE

  - ``application/rdf+xml``
  - ``application/n-triples`` (DEFAULT if ``Accept: */*`` is sent)
  - ``text/turtle``
  - ``text/n3``
  - ``application/ld+json``
  - Other acceptable values: ``application/n-quads``, ``application/rdf+json``, ``application/trig``, ``application/trix``, ``application/x-binary-rdf``
  - ``text/plain`` (returns ``application/n-triples``)
  - ``text/rdf+n3`` (returns ``text/n3``)
  - ``text/x-nquads`` (returns ``application/n-quads``)


RASQAL
------
:Website: Rasqal RDF Query Library
:Documentation: See the Rasqal documentation.
:Parameter key: ``results``.
:JSON-LD (application/ld+json): NOT supported.

Uses ``roqet`` as its RDF query utility.
For variable bindings, the values of FORMAT depend on what Rasqal supports, but include ``simple``
for a simple text format (default), ``xml`` for the SPARQL Query Results XML format, ``csv`` for SPARQL CSV,
``tsv`` for SPARQL TSV, ``rdfxml`` and ``turtle`` for RDF syntax formats, and ``json`` for a JSON version of the results.

For RDF graph results, the values of FORMAT are ``ntriples`` (N-Triples, default),
``rdfxml-abbrev`` (RDF/XML Abbreviated), ``rdfxml`` (RDF/XML), ``turtle`` (Turtle),
``json`` (RDF/JSON resource centric), ``json-triples`` (RDF/JSON triples) or
``rss-1.0`` (RSS 1.0, also an RDF/XML syntax).


MarkLogic
---------
:Website: MarkLogic
:Uses: Only content negotiation (no URL parameters).
:JSON-LD (application/ld+json): NOT supported.

You can use the following methods to query triples:

- SPARQL mode in Query Console. For details, see Querying Triples with SPARQL
- XQuery using the semantics functions, and Search API, or a combination of XQuery and SPARQL. For details, see Querying Triples with XQuery or JavaScript.
- HTTP via a SPARQL endpoint. For details, see Using Semantics with the REST Client API.

Formats are specified as part of the HTTP Accept headers of the REST request.
When you query the SPARQL endpoint with the REST Client APIs, you can specify the result output format. The response format depends on the type of query and the MIME type in the HTTP Accept header.

The list below describes the Accept header/output formats (MIME types) for the different types of SPARQL queries.

- SELECT

  - application/sparql-results+xml
  - application/sparql-results+json
  - text/html
  - text/csv

- ASK queries return a boolean (true or false).

- CONSTRUCT or DESCRIBE

  - application/n-triples
  - application/rdf+json
  - application/rdf+xml
  - text/turtle
  - text/n3
  - application/n-quads
  - application/trig


AllegroGraph
------------
:Website: AllegroGraph
:Documentation: See the AllegroGraph documentation.
:Uses: Only content negotiation (no URL parameters).
:Parameter: The server always looks at the Accept header of a request, and tries to
  generate a response in the format that the client asks for. If this fails,
  a 406 response is returned. When no Accept, or an Accept of */* is specified,
  the server prefers text/plain, in order to make it easy to explore the interface from a web browser.
:JSON-LD (application/ld+json): NOT supported.


- SELECT

  - application/sparql-results+xml (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json (and application/json)
  - text/csv
  - text/tab-separated-values
  - OTHERS: application/sparql-results+ttl, text/integer, application/x-lisp-structured-expression, text/table, application/processed-csv, text/simple-csv, application/x-direct-upis

- ASK

  - application/sparql-results+xml (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json (and application/json)
  - Not supported: text/csv
  - Not supported: text/tab-separated-values

- CONSTRUCT

  - application/rdf+xml (DEFAULT if Accept: */* is sent)
  - text/rdf+n3
  - OTHERS: text/integer, application/json, text/plain, text/x-nquads, application/trix, text/table, application/x-direct-upis

- DESCRIBE

  - application/rdf+xml (DEFAULT if Accept: */* is sent)
  - text/rdf+n3


4store
------
:Website: 4store
:Documentation: See the 4store documentation.
:Uses: Parameters **and** Content Negotiation.
:Parameter key: ``output``.
:Parameter value: alias. If an unexpected alias is used, the server does not handle it properly.
:JSON-LD (application/ld+json): NOT supported.


- SELECT

  - application/sparql-results+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json or application/json (alias json)
  - text/csv (alias csv)
  - text/tab-separated-values (alias tsv). Returns "text/plain" in GET.
  - Other values: text/plain, application/n-triples

- ASK

  - application/sparql-results+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json or application/json (alias json)
  - text/csv (alias csv)
  - text/tab-separated-values (alias tsv). Returns "text/plain" in GET.
  - Other values: text/plain, application/n-triples

- CONSTRUCT

  - application/rdf+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - text/turtle (alias "text")

- DESCRIBE

  - application/rdf+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - text/turtle (alias "text")

:Valid alias for SELECT and ASK: "json", "xml", "csv", "tsv" (also "text" and "ascii")
:Valid alias for DESCRIBE and CONSTRUCT: "xml", "text" (for turtle)


Blazegraph
----------
:Website: Blazegraph (formerly known as Bigdata) & NanoSparqlServer
:Documentation: See the NanoSparqlServer REST API documentation.
:Uses: Parameters **and** Content Negotiation.
:Parameter key: ``format`` (available since version 1.4.0). Setting this parameter will override any Accept header that is present.
:Parameter value: alias. If an unexpected alias is used, the server does not handle it properly.
:JSON-LD (application/ld+json): NOT supported.

- SELECT

  - application/sparql-results+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json or application/json (alias json)
  - text/csv
  - text/tab-separated-values
  - Other values: application/x-binary-rdf-results-table

- ASK

  - application/sparql-results+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json or application/json (alias json)

- CONSTRUCT

  - application/rdf+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - text/turtle (returns text/n3)
  - text/n3

- DESCRIBE

  - application/rdf+xml (alias xml) (DEFAULT if Accept: */* is sent)
  - text/turtle (returns text/n3)
  - text/n3

:Valid alias for SELECT and ASK: "xml", "json"
:Valid alias for DESCRIBE and CONSTRUCT: "xml", "json" (but it returns unexpected "application/sparql-results+json")


GraphDB
-------
:Website: GraphDB, formerly known as OWLIM (OWLIM-Lite, OWLIM-SE)
:Documentation: See the GraphDB documentation.
:Uses: Only content negotiation (no URL parameters).
:Note: If the Accept value is not within the expected ones, the server returns a 406 "No acceptable file format found."
:JSON-LD (application/ld+json): supported (in CONSTRUCT and DESCRIBE).

- SELECT

  - application/sparql-results+xml, application/xml (.srx file)
  - application/sparql-results+json, application/json (.srj file)
  - text/csv (DEFAULT if Accept: */* is sent)
  - text/tab-separated-values

- ASK

  - application/sparql-results+xml, application/xml (.srx file)
  - application/sparql-results+json (DEFAULT if Accept: */* is sent), application/json (.srj file)
  - NOT supported: text/csv, text/tab-separated-values

- CONSTRUCT

  - application/rdf+xml, application/xml (.rdf file)
  - text/turtle (.ttl file)
  - application/n-triples (.nt file) (DEFAULT if Accept: */* is sent)
  - text/n3, text/rdf+n3 (.n3 file)
  - application/ld+json (.jsonld file)

- DESCRIBE

  - application/rdf+xml, application/xml (.rdf file)
  - text/turtle (.ttl file)
  - application/n-triples (.nt file) (DEFAULT if Accept: */* is sent)
  - text/n3, text/rdf+n3 (.n3 file)
  - application/ld+json (.jsonld file)


Stardog
-------
:Website: Stardog
:Documentation: See the Stardog documentation (looks outdated).
:Uses: Only content negotiation (no URL parameters).
:Parameter key: If an unexpected parameter is used, the server ignores it.
:JSON-LD (application/ld+json): supported (in CONSTRUCT and DESCRIBE).


- SELECT

  - application/sparql-results+xml (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json
  - text/csv
  - text/tab-separated-values
  - Other values: application/x-binary-rdf-results-table

- ASK

  - application/sparql-results+xml (DEFAULT if Accept: */* is sent)
  - application/sparql-results+json
  - Other values: text/boolean
  - Not supported: text/csv
  - Not supported: text/tab-separated-values

- CONSTRUCT

  - application/rdf+xml
  - text/turtle (DEFAULT if Accept: */* is sent)
  - text/n3
  - application/ld+json
  - Other acceptable values: application/n-triples, application/x-turtle, application/trig, application/trix, application/n-quads

- DESCRIBE

  - application/rdf+xml
  - text/turtle (DEFAULT if Accept: */* is sent)
  - text/n3
  - application/ld+json
  - Other acceptable values: application/n-triples, application/x-turtle, application/trig, application/trix, application/n-quads

Ontop
-------
:Website: Ontop VKG
:Documentation: See the Ontop documentation.
:Uses: Only content negotiation (no URL parameters).
:Parameter key: If an unexpected parameter is used, the server ignores it.


- SELECT

  - application/sparql-results+json (DEFAULT if Accept: */* is sent)
  - application/sparql-results+xml
  - text/csv (versions before Ontop 5.2 returned text/sparql-results+csv)
  - text/tab-separated-values (versions before Ontop 5.2 returned text/sparql-results+tsv)

- ASK

  - application/sparql-results+json (DEFAULT if Accept: */* is sent)
  - application/sparql-results+xml
  - Other values: text/boolean

- CONSTRUCT

  - text/turtle (DEFAULT if Accept: */* is sent)
  - application/rdf+xml
  - text/n3
  - application/ld+json
  - Other acceptable values: application/n-triples, application/n-quads, rdf+json, rdf+xml

- DESCRIBE

  - text/turtle (DEFAULT if Accept: */* is sent)
  - application/rdf+xml
  - application/rdf+json
  - text/n3
  - application/ld+json
  - Other acceptable values: application/n-triples, application/n-quads

Development
===========

Requirements
------------

The `RDFLib <https://github.com/RDFLib/rdflib>`_ package is used for RDF parsing.

This package is imported in a lazy fashion, i.e. only when needed. If the user never intends to use the
RDF format, the RDFLib package is not imported and the user does not have to install it.
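
For instance, a query that only ever asks for JSON results never triggers the
RDFLib import; a minimal sketch:

.. code-block:: python

    from SPARQLWrapper import SPARQLWrapper, JSON

    # JSON results are parsed with the standard-library json module,
    # so RDFLib is never imported for this kind of query.
    sparql = SPARQLWrapper("http://dbpedia.org/sparql")
    sparql.setQuery("SELECT ?s WHERE { ?s ?p ?o } LIMIT 3")
    sparql.setReturnFormat(JSON)
    print(sparql.queryAndConvert()["results"]["bindings"])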

Source code
-----------

The source distribution contains:

-  ``SPARQLWrapper``: the Python package. You should copy the directory
   somewhere into your PYTHONPATH. Alternatively, you can run
   the distutils script: ``python setup.py install``

-  ``test``: some unit and integration tests. Running the tests requires
   some extra packages to be installed first, so please install the dev dependencies:
   ``pip install '.[dev]'``

-  ``scripts``: some scripts to run the package against some SPARQL endpoints.

-  ``docs``: the documentation.

Community
---------

Community support is available through the RDFLib developers' discussion group, rdflib-dev.
The archives from the old mailing list are still available.

Issues
------

Please `report any issues on GitHub <https://github.com/RDFLib/sparqlwrapper/issues>`_.

Documentation
-------------

The `SPARQLWrapper documentation is available online <https://sparqlwrapper.readthedocs.io/>`_.

Other interesting documents are the latest `SPARQL 1.1 Specification (W3C Recommendation 21 March 2013) <https://www.w3.org/TR/sparql11-query/>`_
and the initial `SPARQL Specification (W3C Recommendation 15 January 2008) <https://www.w3.org/TR/rdf-sparql-query/>`_.

License
-------

The SPARQLWrapper package is licensed under `W3C license`_.

.. _W3C license: https://www.w3.org/Consortium/Legal/2015/copyright-software-and-document

Acknowledgement
---------------

The package was greatly inspired by Lee Feigenbaum's similar package for JavaScript.

Developers involved:

* Ivan Herman 
* Sergio Fernández 
* Carlos Tejo Alonso 
* Alexey Zakhlestin 

Organizations involved:

* World Wide Web Consortium
* Salzburg Research
* Foundation CTIC

.. |Build Status| image:: https://github.com/RDFLib/sparqlwrapper/actions/workflows/test.yml/badge.svg
   :target: https://github.com/RDFLib/sparqlwrapper/actions/workflows/test.yml
.. |PyPi version| image:: https://badge.fury.io/py/SPARQLWrapper.svg
   :target: https://pypi.python.org/pypi/SPARQLWrapper

Owner

  • Name: RDFlib
  • Login: RDFLib
  • Kind: organization

RDFlib, the GitHub organization, is a volunteer-maintained collection of Python tools for working with RDF data.

GitHub Events

Total
  • Issues event: 6
  • Watch event: 31
  • Issue comment event: 5
  • Push event: 3
  • Pull request event: 5
  • Pull request review event: 4
  • Fork event: 2
Last Year
  • Issues event: 6
  • Watch event: 31
  • Issue comment event: 5
  • Push event: 3
  • Pull request event: 5
  • Pull request review event: 4
  • Fork event: 2

Committers

Last synced: 9 months ago

All Time
  • Total Commits: 580
  • Total Committers: 29
  • Avg Commits per committer: 20.0
  • Development Distribution Score (DDS): 0.672
Past Year
  • Commits: 10
  • Committers: 4
  • Avg Commits per committer: 2.5
  • Development Distribution Score (DDS): 0.3
Top Committers
Name Email Commits
Carlos Tejo c****o@l****o 190
Sergio Fernández w****r@a****g 179
eggplants w****w@y****p 79
Alexey Zakhlestin i****s@g****m 63
Nicholas Car n****r@s****m 13
Iwan Aucamp a****a@g****m 9
t0b3 t****r@g****m 8
Vincent Emonet v****t@g****m 7
Nolan Nichols n****s@g****m 7
feger m****r@h****e 2
Hugo h****k 2
Marcelo Jorge Vieira m****l@a****m 2
Olivier Berger o****r@t****u 2
Trevor Andersen t****n@g****m 2
nklsbckmnn 5****n 1
chrysn c****n@f****g 1
Satrajit Ghosh s****h@g****m 1
PandaWill P****l 1
Natanael Arndt a****n@g****m 1
Martijn van Iersel m****l@g****m 1
Marat Charlaganov g****b@c****u 1
Jörn Hees j****s 1
Gunnar Aastrand Grimnes g****l@g****m 1
Edward Betts e****d@4****m 1
David Cottrell c****l 1
Dan Michael O. Heggø d****o@g****m 1
Chris Lamb c****s@c****k 1
Peter Hopfgartner p****r@o****i 1
Benjamin Cogrel b****l@b****r 1

Issues and Pull Requests

Last synced: 6 months ago

All Time
  • Total issues: 70
  • Total pull requests: 47
  • Average time to close issues: 10 months
  • Average time to close pull requests: 3 months
  • Total issue authors: 47
  • Total pull request authors: 20
  • Average comments per issue: 3.6
  • Average comments per pull request: 2.11
  • Merged pull requests: 26
  • Bot issues: 0
  • Bot pull requests: 0
Past Year
  • Issues: 4
  • Pull requests: 3
  • Average time to close issues: N/A
  • Average time to close pull requests: 5 days
  • Issue authors: 4
  • Pull request authors: 2
  • Average comments per issue: 0.25
  • Average comments per pull request: 0.67
  • Merged pull requests: 1
  • Bot issues: 0
  • Bot pull requests: 0
Top Authors
Issue Authors
  • eggplants (10)
  • WolfgangFahl (7)
  • nicholascar (3)
  • aucampia (3)
  • wikier (2)
  • chiarcos (2)
  • lu-pl (2)
  • dayures (1)
  • felixonmars (1)
  • hugobartolo (1)
  • PR0CK0 (1)
  • rjalexa (1)
  • fcbr (1)
  • hendursaga (1)
  • milan252525 (1)
Pull Request Authors
  • eggplants (17)
  • aucampia (7)
  • nicholascar (4)
  • vemonet (2)
  • t0b3 (2)
  • salander93 (2)
  • phopfgartner (2)
  • nklsbckmnn (1)
  • abuonomo (1)
  • lucaswerkmeister (1)
  • amin-siemens (1)
  • cottrell (1)
  • dayures (1)
  • ananya2711 (1)
  • hbruch (1)
Top Labels
Issue Labels
bug (3) discussion (2) enhancement (1) external-bug (1) python2-to-python3 (1)
Pull Request Labels
enhancement (1)

Packages

  • Total packages: 23
  • Total downloads:
    • pypi 865,972 last-month
  • Total docker downloads: 1,713
  • Total dependent packages: 52
    (may contain duplicates)
  • Total dependent repositories: 356
    (may contain duplicates)
  • Total versions: 65
  • Total maintainers: 9
pypi.org: sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 30
  • Dependent Packages: 46
  • Dependent Repositories: 345
  • Downloads: 865,924 Last month
  • Docker Downloads: 1,713
Rankings
Dependent packages count: 0.3%
Downloads: 0.5%
Dependent repos count: 0.8%
Average: 1.8%
Docker downloads count: 2.2%
Stargazers count: 2.8%
Forks count: 4.3%
Maintainers (3)
Last synced: 6 months ago
alpine-v3.18: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 5.5%
Forks count: 10.2%
Stargazers count: 11.6%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.18: py3-sparqlwrapper-pyc

Precompiled Python bytecode for py3-sparqlwrapper

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 5.5%
Forks count: 10.2%
Stargazers count: 11.6%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.13: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Forks count: 7.0%
Stargazers count: 7.2%
Average: 8.4%
Dependent packages count: 19.5%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.14: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Forks count: 7.1%
Stargazers count: 7.4%
Average: 9.0%
Dependent packages count: 21.7%
Maintainers (1)
Last synced: 6 months ago
alpine-edge: py3-sparqlwrapper-pyc

Precompiled Python bytecode for py3-sparqlwrapper

  • Versions: 4
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Average: 9.6%
Forks count: 11.4%
Dependent packages count: 13.3%
Stargazers count: 13.5%
Maintainers (1)
Last synced: 6 months ago
alpine-edge: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 5
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Average: 9.7%
Forks count: 11.1%
Stargazers count: 12.9%
Dependent packages count: 14.6%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.15: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Forks count: 7.5%
Stargazers count: 8.0%
Average: 10.3%
Dependent packages count: 25.6%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.16: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Forks count: 7.9%
Stargazers count: 8.7%
Average: 11.0%
Dependent packages count: 27.3%
Maintainers (1)
Last synced: 6 months ago
pypi.org: sparqlstreamwrapper

SPARQL Endpoint interface to permit streaming queries from Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 15 Last month
Rankings
Stargazers count: 2.8%
Forks count: 4.3%
Dependent packages count: 6.9%
Average: 11.1%
Dependent repos count: 30.5%
Maintainers (2)
Last synced: 6 months ago
alpine-v3.17: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Forks count: 9.7%
Stargazers count: 10.7%
Average: 11.9%
Dependent packages count: 27.3%
Maintainers (1)
Last synced: 6 months ago
conda-forge.org: sparqlwrapper
  • Versions: 5
  • Dependent Packages: 6
  • Dependent Repositories: 9
Rankings
Dependent packages count: 9.0%
Dependent repos count: 11.6%
Average: 13.8%
Forks count: 16.5%
Stargazers count: 17.9%
Last synced: 6 months ago
pypi.org: sparqlwrapper-mosorio

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 8 Last month
Rankings
Stargazers count: 2.8%
Forks count: 4.3%
Dependent packages count: 10.1%
Average: 20.0%
Dependent repos count: 21.5%
Downloads: 61.2%
Maintainers (1)
Last synced: 6 months ago
pypi.org: sparqlwrappermosorio

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 1
  • Downloads: 8 Last month
Rankings
Stargazers count: 2.8%
Forks count: 4.3%
Dependent packages count: 10.1%
Average: 20.8%
Dependent repos count: 21.5%
Downloads: 65.4%
Maintainers (1)
Last synced: 6 months ago
pypi.org: sparqlwrapper.skipssl

SPARQL Endpoint interface to Python

  • Versions: 3
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Downloads: 17 Last month
Rankings
Dependent packages count: 6.6%
Average: 26.7%
Forks count: 30.5%
Dependent repos count: 30.6%
Stargazers count: 39.1%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.22: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.21: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.22: py3-sparqlwrapper-pyc

Precompiled Python bytecode for py3-sparqlwrapper

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.19: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.19: py3-sparqlwrapper-pyc

Precompiled Python bytecode for py3-sparqlwrapper

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Last synced: 6 months ago
alpine-v3.20: py3-sparqlwrapper-pyc

Precompiled Python bytecode for py3-sparqlwrapper

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.21: py3-sparqlwrapper-pyc

Precompiled Python bytecode for py3-sparqlwrapper

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago
alpine-v3.20: py3-sparqlwrapper

SPARQL Endpoint interface to Python

  • Versions: 1
  • Dependent Packages: 0
  • Dependent Repositories: 0
Rankings
Dependent repos count: 0.0%
Dependent packages count: 0.0%
Average: 100%
Maintainers (1)
Last synced: 6 months ago

Dependencies

.github/workflows/test.yml actions
  • actions/checkout v2 composite
  • actions/setup-python v2 composite
docs/requirements.docs.txt pypi
  • sphinx <5
  • sphinx-rtd-theme *
requirements.development.txt pypi
  • mypy >=0.931 development
  • pandas-stubs >=1.2.0.48 development
  • setuptools >=3.7.1 development
requirements.optional.txt pypi
  • pandas >=1.3.5
requirements.txt pypi
  • rdflib >=6.1.1