Semi-Auto Cut-Up: Experimental Condition

Pixels without proverbial provenance, 
they both knew what they were.
Buffet Buddhists, defying cages,
tagged for progressive classification.

Wonder world, inexpressible problems,
all lonely, we live with each other.
Chronic complainers, a hundred times
a hundred, gloomy mind experiments.

The emotional surface of lost futures.
Do we know enough to know the truth?
Unconscious man grapples, but finds
little to grip, advanced, but enough?

Unreadable barbarian news file,
the experimental Machiavellian composer,
weary, whistles on the way home,
her faith in the process, crushed.

Semi-Auto Cut-Up: Another Offering (KJV)

Eat the congregation, together,
smite them afraid, blast vessels.
O ye dead life, mercy, semblance
of a kind, a measure of shadow.

But, they had no prophet, neighbors
and friends, children of fate, troubled
the Others, remember them not, no-name,
wilderness sacrifices, sore consumed.

Desolation came and passed, Death,
bare the enemy, dead, desolate,
good and great together into the land,
begat headstones, seeds unto the earth.

Father of water, turn thyself,
out to sea, high between, separate,
sing of the living, purify mouths
and soul, fetch their inheritance.

Surge, rain on the wise borderlands,
pull the stopper, flush the tidal womb,
from the unleavened, and the unclean,
reborn, another offering to be devoured.

Another A.I. assist, this time from a recently trained neural network with the King James Version (KJV) as the corpus. Since this was the first neural network I tried myself, I learned many valuable lessons. The first training run took 2.5 days, and the result was unusable because of all the newline characters in the original text. I didn’t realize that the text would have to be pre-processed, or what that would entail. I plan on writing a post about the process of making Project Gutenberg texts usable for writing with machine co-authors.
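The newline problem above comes from Project Gutenberg's hard line-wraps: every line is broken at roughly 70 characters, so the model learns spurious line breaks. A minimal sketch of the kind of pre-processing involved (the function name and sample text are my own, not the author's actual pipeline) joins wrapped lines within a paragraph while keeping blank-line paragraph breaks:

```python
import re

def unwrap_gutenberg(text):
    """Collapse hard line-wraps inside paragraphs, keeping
    blank-line paragraph breaks intact. Hypothetical sketch."""
    # Split on blank lines (paragraph boundaries)...
    paragraphs = re.split(r"\n\s*\n", text)
    # ...then replace internal newlines with single spaces.
    return "\n\n".join(
        re.sub(r"\s*\n\s*", " ", p).strip() for p in paragraphs
    )

sample = ("In the beginning\nGod created the\nheaven and the earth.\n\n"
          "And the earth was\nwithout form.")
print(unwrap_gutenberg(sample))
# One line per paragraph, paragraphs separated by a blank line.
```

A real pipeline would also strip the Project Gutenberg header and license footer, but the core fix for the unusable first run is exactly this unwrapping step.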

Also, there is a point to be made about the inherent class obstacles in learning and using neural networks. Having a dedicated machine with the right hardware is the difference between waiting days to train a new model and waiting a few hours. Speed also limits how big a data set you are willing to start with. The KJV is about 5MB, and it took 2 days. Robin Sloan’s pre-trained text is around 123MB. Spending 24+ days to train a model is a serious barrier to entry.
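To make the "24+ days" figure concrete: a back-of-envelope estimate, assuming training time scales roughly linearly with corpus size on the same hardware (a big assumption; real training time depends on epochs, model size, and batch size), suggests the 24+ days quoted above is if anything a conservative lower bound:

```python
# Observed: ~5MB corpus took ~2 days on the author's hardware.
kjv_mb, kjv_days = 5, 2
sloan_mb = 123  # approximate size of Robin Sloan's corpus

# Naive linear scaling (an assumption, not a measurement).
est_days = sloan_mb / kjv_mb * kjv_days
print(f"~{est_days:.0f} days")  # linear scaling predicts ~49 days
```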

Semi-Auto Cut-Up: The First in the First Place

The first in the first place,
The Others, standing beside us.
Aware of destruction, strange,
ineffectual, a matter of force.

The bare path, dark and closed,
From the stairs, an ascent of
story, a complicated service, clean,
psychological, a social alone.

The world has not yet been consumed
by the light of the stars. A universe
has all time and space, experience
a sliver, taste a slice of the whole.

Poem written primarily with Robin Sloan’s Writing With the Machine neural network, with a sprinkle of Webster’s and some selecting, moving about, and adding of pieces to turn it into something that makes sense. It strikes me as a quick method of “writing” cut-up poetry, although, given the source material for the neural network, these will be science fictiony until I can train up another corpus.

I think I’ll be able to try a King James version assisted composition, maybe tomorrow.