Categories
Free Software Virtual Reality

OpenXR on GNU/Linux With OpenHMD and Monado

With the release of Debian 10, it is possible to build Monado on Debian in order to use the OpenXR SDK with headsets supported by OpenHMD.

Hopefully this means a future OpenXR-enabled Firefox will work with them as well.

In the meantime we can at least run the hello_xr demo. Here’s how… (Updated April 2020.)

Build and install OpenHMD:

https://github.com/OpenHMD/OpenHMD

Build and install Monado:

https://gitlab.freedesktop.org/monado/monado/

Build and install the OpenXR SDK:

https://github.com/KhronosGroup/OpenXR-SDK-Source/

To set the Rift to non-desktop (before each run, if not set in the kernel), run:

xrandr --output HDMI-0 --prop --set non-desktop 1

If needed you can check this by running:

xrandr --prop

Then to run the OpenXR-SDK hello_xr demo, run the following in the OpenXR-SDK directory:

XR_RUNTIME_JSON=/usr/local/share/openxr/1/openxr_monado.json ./build/linux_debug/src/tests/hello_xr/hello_xr -g Vulkan

This will show the pocket universe captured in the screenshot at the top of this post in your VR headset.

Categories
Crypto Free Software

Trusted Third Party Hardware

From the point of view of the Bitcoin white paper, trusted platform modules, programmable secure elements, and secure enclaves are all examples of the presence of trusted third parties. They are “Treacherous Computing” hardware that someone other than you ultimately controls, and who you must trust to act in your best interests.

If using hardware that obeys trusted third parties to improve the security or speed of cryptocurrencies offered obvious benefits that could not be achieved in any other way, then objecting to it on ideological grounds might seem like an example of Emerson’s maxim that “a foolish consistency is the hobgoblin of little minds”. But trusted third party hardware is not necessarily more secure and trustworthy than hardware or software that the user controls.

Promoting trusted third party hardware solutions in cryptocurrency without acknowledging this should therefore be questioned both ideologically and pragmatically.

Ideologically because the CEO of a hardware wallet company should not have more control of the systems that you use to hold your cryptocurrency than you do, and they should not be beholden to their chip vendor for that power either.

Pragmatically because adding more places for malware to infect and hide in, and in ways that may be impossible to detect and remove, does not make things more secure.

Given all this it is important to look beyond the marketing of trusted third party hardware. Here are some articles describing issues with such systems.

Secure Elements

Government agencies do pressure chip producers to include backdoors to their products, so why should one suppose it would be different with SE, especially knowing that these are being used for financial transactions? The user would never learn about this, because of the nature of the SE.

Is “Banking-grade Security” Good Enough for Your Bitcoins?

A team of security engineers from Rapid7 at the Black Hat USA 2016 conference in Las Vegas demonstrated how small and simple modifications to equipment would be enough for attackers to bypass the Chip-and-PIN protections and enable unauthorized transactions.

This ATM Hack Allows Crooks to Steal Money From Chip-and-Pin Cards

The Infineon Bug

A crippling flaw in a widely used code library has fatally undermined the security of millions of encryption keys used in some of the highest-stakes settings, including national identity cards, software- and application-signing, and trusted platform modules protecting government and corporate computers.

Millions of high-security crypto keys crippled by newly discovered flaw

A vulnerability was identified in the RSA key generation method used by Trusted Platform Modules (TPMs) manufactured by Infineon and contained in some Lenovo products. RSA public keys generated by the Infineon TPM for use by certain software programs should be considered insecure.

RSA Keys Generated by Infineon TPMs are Insecure

Of course, if Infineon made this mistake, who else could have made a similar faux pas?

ROCA encryption #fail: Worse than we thought (and way worse than KRACK)

Secure Enclaves

Researchers have demonstrated using Intel’s Software Guard Extensions to hide malware and steal cryptographic keys from inside SGX’s protected enclave.

Using Intel’s SGX to Attack Itself

It’s still too early to know what the full fallout from the SEP’s decryption will be, but it could open the door for password harvesting, spoofing, and other security-compromising attacks.

iOS users beware: A hacker has just published a decryption key for the Apple Secure Enclave, which is responsible for processing Touch ID transactions.

Categories
Art Free Culture Free Software Projects

Neterarti – Net Art Social Networking Freedom

neterarti-screenshot-marc

(Image via Marc Garrett)

https://neterarti.furtherfield.org/

Neterarti is Furtherfield’s new social network for net artists, based on GNU social, the Free Software social networking system. If you’re familiar with Twitter it’s very similar, and it’s easy to access via the web or via desktop and mobile apps.

Sign up and start netting and arting!

Categories
Art Computing Free Software Generative Art Projects Uncategorized

Minara 0.4.0

minara-cairo-gtk-test

I’ve been making my regular (and accidentally six-yearly) update to Minara, my vector graphics program.

The new version switches from GLUT to Gtk for the windowing system, from GLU to Cairo for the renderer, and from C to pure Scheme for the core application. It’s all written in the GNU Project’s Guile Scheme system.

Minara is Lisp all the way down: the application, tools, and graphics files are all written in Scheme. It’s designed as an environment for 2D generative vector art hacking.

Categories
Art Computing Free Software Projects

“Art Is” Wordcloud (Streaming Aesthetics)

art_is wordcloud

Words used after the phrase “art is” on Twitter (minus some stopwords).

Processing code in the streaming-aesthetics repository.

Categories
Art Computing Free Software Generative Art Howto Uncategorized

WordNet

We can use NLTK’s support for WordNet to help generate and classify text.

from nltk.corpus import wordnet as wn
from nltk.corpus import sentiwordnet as swn

def make_synset(word, category='n', number='01'):
    """Conveniently make a synset"""
    number = int(number)
    return wn.synset('%s.%s.%02i' % (word, category, number))

>>> dog = make_synset('dog')
>>> dog.definition
'a member of the genus Canis (probably descended from the common wolf) that has been domesticated by man since prehistoric times; occurs in many breeds'

A synset is WordNet’s representation of a word/concept. Looking at the definition confirms that we have the synset for canis familiaris rather than persecution or undesirability.
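If you’re not sure which sense you have, wn.synsets() lists every synset for a word (output truncated here); it’s also a handy way to find the category and number arguments for make_synset.

>>> wn.synsets('dog')
[Synset('dog.n.01'), ...]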

>>> dog.hypernyms()
[Synset('domestic_animal.n.01'), Synset('canine.n.02')]

Hypernyms are more general concepts. ‘dog’ has two of them, which shows that WordNet is not arranged in a simple tree of concepts. This makes checking for common ancestors slightly more complex but represents concepts more realistically.

>>> dog.hyponyms()
[Synset('puppy.n.01'), Synset('great_pyrenees.n.01'), Synset('basenji.n.01'), Synset('newfoundland.n.01'), Synset('lapdog.n.01'), Synset('poodle.n.01'), Synset('leonberg.n.01'), Synset('toy_dog.n.01'), Synset('spitz.n.01'), Synset('pooch.n.01'), Synset('cur.n.01'), Synset('mexican_hairless.n.01'), Synset('hunting_dog.n.01'), Synset('working_dog.n.01'), Synset('dalmatian.n.02'), Synset('pug.n.01'), Synset('corgi.n.01'), Synset('griffon.n.02')]

Hyponyms are more specific concepts. ‘dog’ has several. These may have hypernyms other than ‘dog’, and may have several hyponyms themselves.

def _recurse_all_hypernyms(synset, all_hypernyms):
    synset_hypernyms = synset.hypernyms()
    if synset_hypernyms:
        all_hypernyms += synset_hypernyms
        for hypernym in synset_hypernyms:
            _recurse_all_hypernyms(hypernym, all_hypernyms)

def all_hypernyms(synset):
    """Get the set of hypernyms of the hypernym of the synset etc.
       Nouns can have multiple hypernyms, so we can't just create a depth-sorted
       list."""
    hypernyms = []
    _recurse_all_hypernyms(synset, hypernyms)
    return set(hypernyms)

>>> all_hypernyms(dog)
set([Synset('chordate.n.01'), Synset('living_thing.n.01'), Synset('physical_entity.n.01'), Synset('animal.n.01'), Synset('mammal.n.01'), Synset('object.n.01'), Synset('vertebrate.n.01'), Synset('entity.n.01'), Synset('carnivore.n.01'), Synset('domestic_animal.n.01'), Synset('canine.n.02'), Synset('placental.n.01'), Synset('organism.n.01'), Synset('whole.n.02')])

We can recursively fetch the hypernyms of a synset. Since ‘dog’ has two hypernyms this isn’t a single list of hypernyms.
We can use this to find how similar different words are by searching for common ancestors.
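Here’s a minimal sketch of doing that by hand with the all_hypernyms function above (common_ancestors is just an illustrative name):

def common_ancestors(synset_a, synset_b):
    """Get the set of concepts that both synsets descend from"""
    return all_hypernyms(synset_a) & all_hypernyms(synset_b)
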
The Python WordNet library can find common hypernyms for us though.

>>> cat = make_synset('cat')
>>> cat.common_hypernyms(dog)
[Synset('chordate.n.01'), Synset('living_thing.n.01'), Synset('physical_entity.n.01'), Synset('animal.n.01'), Synset('mammal.n.01'), Synset('vertebrate.n.01'), Synset('entity.n.01'), Synset('carnivore.n.01'), Synset('object.n.01'), Synset('placental.n.01'), Synset('organism.n.01'), Synset('whole.n.02')]
>>> steel = make_synset('steel')
>>> steel.common_hypernyms(dog)
[Synset('physical_entity.n.01'), Synset('entity.n.01')]
>>> sunset = make_synset('sunset')
>>> sunset.common_hypernyms(dog)
[Synset('entity.n.01')]

As might be expected, cats and dogs are more similar than steel or sunsets.
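NLTK can also put a number on this: a synset’s path_similarity method scores two synsets by the shortest path between them in the hypernym/hyponym graph, so it should rate dog and cat as closer than dog and steel (the exact values depend on the WordNet data).

# Higher score: dog and cat share carnivore, mammal, etc.
dog_cat_similarity = dog.path_similarity(cat)
# Lower score: dog and steel only share very general concepts.
dog_steel_similarity = dog.path_similarity(steel)
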
We can recursively fetch the hyponyms of a synset. This gives us the set of objects or concepts with a kind-of relationship to the word.

def _recurse_all_hyponyms(synset, all_hyponyms):
    synset_hyponyms = synset.hyponyms()
    if synset_hyponyms:
        all_hyponyms += synset_hyponyms
        for hyponym in synset_hyponyms:
            _recurse_all_hyponyms(hyponym, all_hyponyms)

def all_hyponyms(synset):
    """Get the set of the tree of hyponyms under the synset"""
    hyponyms = []
    _recurse_all_hyponyms(synset, hyponyms)
    return set(hyponyms)

>>> all_hyponyms(dog)
set([Synset('harrier.n.02'), Synset('water_spaniel.n.01'), Synset('standard_poodle.n.01'), Synset('dandie_dinmont.n.01'), Synset('wirehair.n.01'), Synset('toy_manchester.n.01'), Synset('puppy.n.01'), Synset('briard.n.01'), Synset('beagle.n.01'), Synset('siberian_husky.n.01'), Synset('manchester_terrier.n.01'), Synset('bloodhound.n.01'), ...

WordNet has some support for synonyms and antonyms via lemmas.

def synset_synonyms(synset):
    """Get the synonyms for the synset"""
    return set([lemma.synset for lemma in synset.lemmas])

def synset_antonyms(synset):
    """Get the antonyms for [the first lemma of] the synset"""
    return set([lemma.synset for lemma in synset.lemmas[0].antonyms()])

>>> synset_synonyms(sunset)
set([Synset('sunset.n.01')])
>>> synset_antonyms(sunset)
set([Synset('dawn.n.01')])

And we can find related concepts by getting all the hyponyms of a word’s hypernyms.

def all_peers(synset):
    """Get the set of all peers of the synset (including the synset).
       If the synset has multiple hypernyms then the peers will be hyponyms of
       multiple synsets."""
    hypernyms = synset.hypernyms()
    peers = []
    for hypernym in hypernyms:
        peers += hypernym.hyponyms()
    return set(peers)

>>> all_peers(sunset)
set([Synset('zero_hour.n.01'), Synset('rush_hour.n.01'), Synset('early-morning_hour.n.01'), Synset('none.n.01'), Synset('midnight.n.01'), Synset('happy_hour.n.01'), Synset('dawn.n.01'), Synset('bedtime.n.01'), Synset('late-night_hour.n.01'), Synset('small_hours.n.01'), Synset('noon.n.01'), Synset('sunset.n.01'), Synset('twilight.n.01'), Synset('mealtime.n.01'), Synset('canonical_hour.n.01'), Synset('closing_time.n.01')])

We use sets here so that common ancestors and children appear only once, and to allow for boolean set operations on concepts.
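For example, a small sketch using synsets that appear in the hyponym listing above:

# Working dogs that are also hunting dogs (intersection), and dogs that
# are not hunting dogs (difference).
working = all_hyponyms(make_synset('working_dog'))
hunting = all_hyponyms(make_synset('hunting_dog'))
both = working & hunting
non_hunting = all_hyponyms(dog) - hunting
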
It’s trivial to get the word (or words) for a synset.
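(synsets_words below relies on a small synset_word helper that isn’t shown in this post; here’s a minimal sketch, assuming the older attribute-style NLTK API used throughout, where synset.name is a string like 'dog.n.01'.)

def synset_word(synset):
    """Get the word for a synset as a plain string"""
    return synset.name.split('.')[0].replace('_', ' ')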

def synsets_words(synsets):
    """Get the set of strings for the words represented by the synsets"""
    return set([synset_word(synset) for synset in synsets])

>>> synsets_words(all_hyponyms(dog))
set(['rottweiler', 'bull mastiff', 'belgian sheepdog', 'courser', 'brabancon griffon', 'toy terrier', 'fox terrier', 'sennenhunde', 'standard poodle', 'saluki', 'pointer', 'toy spaniel', 'setter', 'giant schnauzer', 'housedog', 'papillon', 'american foxhound', 'weimaraner', 'cocker spaniel', 'basenji', 'beagle', ...

WordNet has part/whole, group and substance relationships.

>>> body = make_synset('body')
>>> body.part_meronyms()
[Synset('arm.n.01'), Synset('articulatory_system.n.01'), Synset('body_substance.n.01'), Synset('cavity.n.04'), Synset('circulatory_system.n.01'), Synset('crotch.n.02'), Synset('digestive_system.n.01'), Synset('endocrine_system.n.01'), Synset('head.n.01'), Synset('leg.n.01'), Synset('lymphatic_system.n.01'), Synset('musculoskeletal_system.n.01'), Synset('neck.n.01'), Synset('nervous_system.n.01'), Synset('pressure_point.n.01'), Synset('respiratory_system.n.01'), Synset('sensory_system.n.02'), Synset('torso.n.01'), Synset('vascular_system.n.01')]

>>> dog.member_holonyms()
[Synset('canis.n.01'), Synset('pack.n.06')]

>>> wood = make_synset('wood')
>>> wood.substance_holonyms()
[Synset('beam.n.02'), Synset('chopping_block.n.01'), Synset('lumber.n.01'), Synset('spindle.n.02')]
>>> wood.substance_meronyms()
[Synset('lignin.n.01')]

We can use hypernyms to classify words into domains using WordNet, but there’s an existing domain classification system in the form of WordNet Domains. It can be downloaded here. Code for using this can be found on Stack Overflow. But it doesn’t seem to work with nltk 3.0 (the synset numbers don’t match).
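As a rough stand-in (this is a sketch of the hypernym-based approach mentioned above, not WordNet Domains itself), we can test whether a domain synset appears among a word’s recursive hypernyms:

def in_domain(synset, domain_synset):
    """Crudely classify a synset by checking whether domain_synset is one of
       its recursive hypernyms"""
    return domain_synset in all_hypernyms(synset)

>>> in_domain(dog, make_synset('animal'))
True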

And there’s a sentiment score system for WordNet in the form of SentiWordNet. NLTK provides an interface for it (imported as swn at the top of this post).

def make_senti_synset(word, category='n', number='01'):
    """Conveniently make a senti_synset"""
    number = int(number)
    return swn.senti_synset('%s.%s.%02i' % (word, category, number))

def synsets_sentiments(synsets):
    """Return the objs, pos, neg and pos - neg score sums for the synsets"""
    pos = 0.0
    obj = 0.0
    neg = 0.0
    for synset in synsets:
        try:
            pos += synset.pos_score()
            obj += synset.obj_score()
            neg += synset.neg_score()
        except AttributeError:
            pass
    return obj, pos, neg, pos - neg

>>> happy = make_senti_synset('happy', 'a')
>>> happy.pos_score()
0.875
>>> happy.neg_score()
0.0
>>> happy.obj_score()
0.125

>>> synsets_sentiments([make_senti_synset(word, 'a') for word in 'happy sad angry heavy light depressing'.split()])
(2.5, 1.5, 2.0, -0.5)

Not every word has a sentiment score, hence the try/except block in synsets_sentiments.

WordNet is sensitive to senses and it’s hard to automatically resolve senses when processing arbitrary text. When generating text and using WordNet to find words, it’s important (and easier) to set the correct sense for the synset.

>>> colour = make_synset('colour', 'n', 6)
>>> all_hyponyms(colour)
set([Synset('chrome_red.n.01'), Synset('primary_color.n.01'), Synset('light_brown.n.01'), Synset('sallowness.n.01'), Synset('hazel.n.04'), Synset('iron-grey.n.01'), Synset('olive_green.n.01'), Synset('tan.n.02'), Synset('pastel.n.01'), Synset('coal_black.n.01'), Synset('pinkness.n.01'), Synset('vandyke_brown.n.01'), Synset('beige.n.01'), Synset('blue.n.01'), Synset('shade.n.02'), Synset('achromatic_color.n.01'), Synset('whiteness.n.03'), Synset('coral.n.01'), Synset('chromatism.n.02'), Synset('apatetic_coloration.n.01'), ...

This gives concepts on different levels. Maybe if we try the peers of a colour.

>>> all_peers(make_synset('red'))
set([Synset('red.n.01'), Synset('pastel.n.01'), Synset('purple.n.01'), Synset('green.n.01'), Synset('olive.n.05'), Synset('complementary_color.n.01'), Synset('brown.n.01'), Synset('blue.n.01'), Synset('blond.n.02'), Synset('yellow.n.01'), Synset('orange.n.02'), Synset('pink.n.01'), Synset('salmon.n.04')])

OK maybe if we try the children of a concept.

>>> all_hyponyms(make_synset('chromatic_color'))
set([Synset('chrome_red.n.01'), Synset('light_brown.n.01'), Synset('hazel.n.04'), Synset('olive_green.n.01'), Synset('tan.n.02'), Synset('pastel.n.01'), Synset('pinkness.n.01'), ...

Perhaps the leaf nodes.

def _recurse_leaf_hyponyms(synset, leaf_hyponyms):
    synset_hyponyms = synset.hyponyms()
    if synset_hyponyms:
        for hyponym in synset_hyponyms:
            _recurse_leaf_hyponyms(hyponym, leaf_hyponyms)
    else:
        leaf_hyponyms.append(synset)

def leaf_hyponyms(synset):
    """Get the set of leaf nodes from the tree of hyponyms under the synset"""
    hyponyms = []
    _recurse_leaf_hyponyms(synset, hyponyms)
    return set(hyponyms)

>>> leaf_hyponyms(make_synset('chromatic_color'))
set([Synset('taupe.n.01'), Synset('snuff-color.n.01'), Synset('chrome_red.n.01'), Synset('light_brown.n.01'), Synset('hazel.n.04'), Synset('olive_drab.n.01'), Synset('old_gold.n.01'), Synset('chocolate.n.03'), Synset('yellowish_pink.n.01'), Synset('yellowish_brown.n.01'), Synset('tyrian_purple.n.02'), ...

That looks good. All colours, no intermediate concepts.

We can use this set of words to choose colours, or to categorize words as colours.
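A quick sketch of both uses, building on the helpers above (colour_words, random_colour and is_colour are illustrative names):

import random

colour_words = synsets_words(leaf_hyponyms(make_synset('chromatic_color')))

def random_colour():
    """Choose a colour word at random"""
    return random.choice(list(colour_words))

def is_colour(word):
    """Categorize a word as (this sense of) a colour"""
    return word in colour_words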

I hope this demonstrates that WordNet can be a very useful resource for Generative Art and Digital Humanities projects.

Categories
Free Culture Free Software Reviews

The People’s Platform

“The People’s Platform” (TPP) is a frustrating read. An anti-techno-utopian critique of the economics and politics of culture on the Internet, it contains much interesting research and some useful ideas but is hamstrung by a year zero activism approach to the history and current state of the struggle for liberty and sustainability in technology and media.

Year zero activism has two planks. Firstly, the situation has never been worse and only now are activists starting to tackle it. Secondly, anyone who may appear to have previously done so is actually part of the problem. Previous activism is at best ineffective and at worst exacerbatory; previous activists were tone deaf to, or in reality made worse, the very issues they sought to address.

In TPP this leads at times to an almost ‘pataphysical identity of opposites. Google and Wikipedia are both “open”. Chris Anderson and Richard Stallman both use the word “free”. The nadir of this approach comes later in the book when TPP is explaining the economic and thereby cultural harm of free culture and free software:

Cohen is highlighting a value that has long been central to any progressive movement: respect for labor. From this angle it’s clear that “copyleft”, as the free culture position on copyright is sometimes called, is not “left” in the traditional sense. As Richard Stallman told me, he designed copyleft to ensure the freedom of users to redistribute and modify copies of software. Freedom to tinker is the paramount value it promotes, but a left worthy of the name has to balance that concern with the demand for equality, for parity of wealth and power.

There’s no part of this that’s right.

Stallman’s creation of copyleft was a product of the political development of Free Software in reaction to the alienation of the products of hacker labour. It’s an answer to the property question, which is a question of the left “in the traditional sense”. It entails respect for labour, and ensures that workers can charge for and be paid for their labour.

Users who modify and “tinker” with software do so via programming, that is by working as programmers, by performing the labour of software development. Software developers are first of all software users. If you are not free to use software you are certainly not free to develop it. The same is true of cultural production, a point that TPP seems slightly more open to.

“Copyleft” is not a blanket term for free culture approaches to copyright, it is the name of a particular licensing approach that seeks to address the restrictions of copyright. There is no single free culture approach to copyright. There are copyright abolitionists, copyright libertarians, copyright socialists and those, like Stallman, for whom copyright’s ironisation by copyleft is a means to a political end.

Seeking to reduce free software and free culture to a progressive left wing movement rather than retain the nonpartisan approach that has seen their successes (or, as TPP would have it, has led to identity with their proprietary others) would undermine them. It’s classic entryism, finding a successful specific social cause to shame into attempting more general radical politics. It’s an approach that is doomed to failure.

And copyleft is precisely intended to equalise wealth and power in the use of software. You can share that wealth, and you cannot exert power over anyone else to prevent them from doing so as well. What you cannot do without breaking the effectiveness of copyleft, and what each new critic of copyleft is drawn to like a moth to a flame, is to yoke copyleft’s reflexive ironisation of copyright on software or cultural work to extraneous political objectives.

TPP continues:

Copyleft, with its narrow emphasis on software freedom, even when broadened to underscore the freedom of speech implications of such a position, offers a limited political response to entrenched systems of economic privilege, and it does not advance limits on profitability or promote fair compensation. Free culture, with its emphasis on access, does not necessarily lead to a more just social order.

Ignoring the slip from free software to free culture, the slip from social to economic justice, and the inaccurate characterization of free culture as emphasizing access, this is a political erasure. Free software and free culture may not have provided grossly coercive tools to the political left but they have, by TPP’s own explanation of their redistributive and deprivileging effects, led to a more just social order. And it requires precisely the ‘pataphysics of “free” and “open” that TPP develops to argue that they limit compensation but not profit.

Later, TPP calls for the development of more socialised alternatives to Web 2.0’s ad-driven surveillance model, and for the development of more equitable alternatives to unpaid cultural workers trying to live on whuffie while making Silicon Valley CEOs rich. I agree that this is vitally important. I’ve worked on several myself. I’ve seen creators paid, clients satisfied, citizens communicating, audiences enjoying media, with millions of dollars put into the cultural economy and tens of thousands of people engaged each month by projects I’ve been involved in. There is absolutely more work to do, but ignoring existing efforts or, worse, conflating them with the problems they exist to address will only ensure that this is always the case.

There is another key conclusion of TPP that I agree with wholeheartedly. We need a sustainable ecosystem for culture. That is, we need technological and economic systems that sustainably align consumption and production incentives with each other and with political and creative liberty. And state and corporate mechanisms for spreading risk absolutely have a part to play in this. But as blank media levies and the deep packet inspection consequences of the proposals of “Promises To Keep” show, this is a task that needs approaching with an insight and subtlety that both pro- and anti- free culture activists often lack.

In this sense at least TPP is not year zero, it is business as usual.

Categories
Free Software Projects

ace

ace
ace is a command-line development environment for Ethereum contracts.

It’s designed to simplify writing and testing contracts. The initial (alpha!) version supports Serpent contracts and local testing using pyethereum.

You can get it here:

https://gitorious.org/robmyers/ethereum-ace/

For an Emacs mode for editing Serpent code, see the serpent-mode post below.

Bug reports gratefully received.

Categories
Free Software Projects

serpent-mode for Emacs

serpent-mode

serpent-mode is a GNU Emacs major mode for editing and compiling Serpent code.

Serpent is a Python-inspired language for writing smart contracts that compile to Ethereum Virtual Machine bytecode. serpent-mode adds syntax highlighting and indentation for Serpent code, and allows files to be compiled from within Emacs.

You can get serpent-mode here:

https://gitorious.org/robmyers/serpent-mode/

https://github.com/robmyers/serpent-mode

Some of the indentation code has been borrowed from Emacs’ built-in Python mode but any deficiencies are a result of my simplifying it to work with Serpent. Reports of errors and omissions gratefully received.

Categories
Free Software

Ethereum Contract Free Software Licensing

Here’s a simple example of a contract that is licensed under the GNU Affero General Public License:

LICENSE = ["Copyright 2014 Rob Myers", "Licensed GNU AGPL V3+"]
SOURCE = ["https:\/\/gitorious.org\/robmyers\/", "artworld-ethereum/"]

// Make sure we have enough gas to run the contract
if tx.value < tx.basefee * 100:
    // If not, stop
    stop

if msg.data[0] == "license":
    return(LICENSE, 2)
else if msg.data[0] == "source":
    return(SOURCE, 2)
else:
    // Return false
    return(0)

Assuming that being part of the blockchain doesn’t clash with the AGPL. Anyone? 🙂