Dissemination Machine

war printing
December 1, 2012

Filed under: Uncategorized — karmafeast @ 00:07

Asuras with karma shall not pass the sea of milk

We must be getting close to modularized weapon platforms printable in real time.

E.g. rapid 3D printing in military-grade materials, using common munitions for projectile or tool facilitation (e.g. a power hammer) if possible. Like the 40K folk do.

An airborne (orbital) carrier, or a group of networked plane systems (rail acceleration, deployment platforms), deploys munitions, items, or other dynamically generated war hardware, ideally for remote robots, or for human soldiers, to use.

A soldier might respond to a situation in real time by requesting a custom object to be created by the printing group. Assembly might happen in steps, with parts released at multiple altitudes and siphoned / fired down to an assembly point mid-fall or at impact (insta-forge / temper).

Ideally the remote robots are controlled by human soldiers, whose movements are tracked in some manner with motion capture technologies and played out by the remote robot device. It seems to make sense that they have the same gross morphology as a human, in size, or at least enough to hold the items requested.

I do not know how much the militaries and researchers of this world practice control of disembodied control systems where the actions of the controller do not correspond well with the outcome. I guess it's like learning to use a touchscreen or video game controller, but with the whole body. That would certainly be facilitated by a remote unit that maps well to human size and shape in a human world, but special-purpose systems might be far better suited without such constraints.

In the case of our remote robots being driven by humans far away, streaming commands to them via their 3D war printing group above: imagine a Kinect-like system (or something else, marker balls or clever ways of reading IR lighting, etc.) programmed specifically to track a person, who might wear clothing that assists the accuracy of tracking.
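As a minimal sketch of that capture-and-stream loop, assuming some generic device that yields per-frame joint positions (`Frame`, `send_to_robot`, and the joint layout are hypothetical placeholders, not a real Kinect API):

```python
# Hypothetical pose-streaming loop: smooth each captured frame and
# forward it to the remote unit. Not any real SDK; a sketch only.
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float
    joints: dict[str, tuple[float, float, float]]  # name -> (x, y, z) in metres

def smooth(prev: dict, cur: dict, alpha: float = 0.3) -> dict:
    """Exponential moving average to damp tracking jitter.
    Assumes the same fixed joint set in every frame."""
    if not prev:
        return dict(cur)
    return {name: tuple(alpha * c + (1 - alpha) * p
                        for c, p in zip(cur[name], prev[name]))
            for name in cur}

def stream_poses(frames, send_to_robot):
    """Smooth each frame and ship it to the remote robot."""
    state: dict = {}
    for frame in frames:
        state = smooth(state, frame.joints)
        send_to_robot({"t": frame.timestamp, "joints": state})
```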

If trained, the soldier could use their hands around a virtual lathe or series of tools to signal the type of thing they want the airborne factory to produce in real time as a tool for immediate use.

For example, swivelling the wrist, using the hand to form a tube, and then extending the distance between the arms to a significant length might be captured as the intention of drawing a spear. The machine can lathe in real time. The soldier might lick their lips or make a sharp cut motion to indicate where on the shape just carved in air should be sharp, or explosive, or projectile-launching. There can be many such command motions.
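A hedged sketch of one such gesture rule: if the wrists move apart past a threshold over a capture window, emit a "spear" primitive whose length is the final wrist separation. The gesture vocabulary, joint names, and threshold here are invented for illustration:

```python
# Toy gesture rule for the "draw a spear" motion described above.
import math

def detect_spear(frames, min_length=1.2):
    """frames: sequence of {'l_wrist': (x, y, z), 'r_wrist': (x, y, z)}.
    Returns a fabrication primitive if the wrists were drawn apart."""
    start = math.dist(frames[0]["l_wrist"], frames[0]["r_wrist"])
    end = math.dist(frames[-1]["l_wrist"], frames[-1]["r_wrist"])
    if end - start > min_length:
        return {"primitive": "spear", "length_m": end}
    return None
```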

One might spin a forearm to form a loop, and indicate fill by opening and closing the palm. Complex movements can be captured from the wrist's point of view, again with the concept of a rapid lathe. The limit on construction speed for an item made this way would be the skill of the user, the software's recognition of intent, the accuracy of capture, and the rapidity and variability of construction while still maintaining near real-time output.
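The rapid lathe itself could be sketched as a solid of revolution: treat the captured wrist track as a radius profile r(z) and revolve it about the axis. This toy version (an assumed pipeline, not any real printer toolchain) emits only a point cloud; a real system would need a watertight mesh and material constraints:

```python
# Revolve a captured radius profile into a point cloud, the crude
# geometric core of the "rapid lathe" idea.
import numpy as np

def revolve(profile_z, profile_r, segments=64):
    """profile_z, profile_r: arrays of axial positions and radii."""
    theta = np.linspace(0, 2 * np.pi, segments, endpoint=False)
    z = np.repeat(profile_z, segments)
    r = np.repeat(profile_r, segments)
    t = np.tile(theta, len(profile_z))
    return np.column_stack([r * np.cos(t), r * np.sin(t), z])

# E.g. a crude spear: a long thin shaft tapering to a point.
z = np.linspace(0.0, 1.8, 50)
r = np.where(z < 1.6, 0.015, 0.015 * (1.8 - z) / 0.2)
points = revolve(z, r)
```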

You'd need macros, so to speak, or spells if you prefer.
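Something like a lookup from recognised gesture sequences to pre-approved blueprints, so the soldier isn't sculpting every object from scratch. A toy "spellbook" (all names illustrative):

```python
# Hypothetical macro table: gesture sequences mapped to blueprint IDs.
SPELLBOOK = {
    ("loop", "fill"):            "grenade_casing_v3",
    ("tube", "extend", "sharp"): "spear_std",
    ("tube", "extend", "fire"):  "launcher_tube_std",
}

def cast(gesture_sequence):
    """Return a blueprint ID if the sequence matches a known macro, else None."""
    return SPELLBOOK.get(tuple(gesture_sequence))
```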

Like pottery, I guess. But what's the point?

This soldier can craft in real time, and the requests would be delivered as rapidly as possible. Ideally by firing a capsule right at the human / remote robot that will use the output, or at the intended target: pointing, or indicating a projectile trajectory to a weapon device immediately before throwing it. Think of a guided grenade with a small rocket for propulsion that follows the soldier's intended trajectory from a recorded fling example (or tracks in real time, where the throw itself or another command indicates curve, angle, power, time-delayed thrust, etc.).
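One plausible sketch of the fling-example recording: least-squares fit a release velocity from the last few tracked wrist positions, then propagate a simple ballistic arc for the projectile to follow. Drag, guidance corrections, and the time-delayed thrust mentioned above are all omitted, and the function names are assumptions:

```python
# Estimate release velocity from tracked wrist samples, then propagate
# the intended arc. A sketch under heavy simplifying assumptions.
import numpy as np

def release_velocity(times, positions):
    """times: (n,) seconds; positions: (n, 3) metres.
    Per-axis linear fit; the slope is the velocity at release."""
    times = np.asarray(times, float)
    A = np.column_stack([times, np.ones_like(times)])
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(positions, float), rcond=None)
    return coeffs[0]  # (vx, vy, vz)

def predicted_arc(p0, v0, g=9.81, dt=0.02, steps=200):
    """Simple drag-free ballistic propagation of the intended trajectory."""
    pts = []
    p, v = np.array(p0, float), np.array(v0, float)
    for _ in range(steps):
        pts.append(p.copy())
        v = v + np.array([0.0, 0.0, -g]) * dt
        p = p + v * dt
    return np.array(pts)
```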

If it is a solid tool or weapon, the item can be dropped directly, if sufficiently resilient.

The robot proxy or human is now able to supply itself with tools, repair modules, etc.

This could be done by voice or other command input vectors, of course.

But one needs one's voice to sing or speak, and those words can be command or strategy-and-intent signalling to machine assistants, just as with the motions.

It's interesting to think they'll probably get around to making this if they haven't already. It cannot really be stopped, if you accept the proposition that they will create it in the cry of defense once the tech is staring them in the face. $2,000 home 3D gun printers tell me it is. We'll need to sell blueprint originals and limited-license runs of items if we are to retain control of manufacture. Interesting times ahead when we have replicators (lol! it is pretty awesome that this could happen, and is happening, in our lifetimes).

We need a better nerve interface. We have to speak and spell right now; it's too slow.

Go in through the eye: the nerve is thick there, and you don't necessarily kill them if you mess up. Ideally we'd build an upgradeable port: a sheet for nerve interfacing that we might put small rods into and send charge down, like they do in the pictures of artificial eye implants, and that may be biodegradable over a few weeks or years. You'd think if they were further along with that than they let on publicly, they would not be stingy with the tech. You'd hope. Being blind must be awful.