Is War Now Post Kinetic?

rodneybrooks.com/is-war-now-post-kinetic/

When the world around us changes, often due to technology, we need to change how we interact with it, or we will not do well.

Kodak was well aware of the digital photography tsunami it faced, but was not able to transform itself away from being a film photography company until it was too late, and is no more. On the other hand, Pitney Bowes started its transformation early, from a provider of mail stamping machines to an eCommerce solutions company, and remains in the S&P 500.

Governments and politicians are not immune from the challenges that technological change produces on the ground, and former policies and vote-getting proclamations may lag current realities^{\big 1}.

I do wonder if war is transforming itself around us to being fought in a non-kinetic way, and which nations are aware of that, and how that will change the world going forward. And, importantly for the United States, what does that say about what its Federal budget priorities should be?

A Brief History of Kinetic War

The technology of war has always been about delivering more kinetic energy, faster, more accurately and with more remote standoff from the recipient of the energy, first to human bodies, and then to infrastructure and supply chains.

New technologies caused changes in tactics and strategies, and many of them eventually made old technologies obsolete, but often a new technology would co-exist with one that it would eventually supplant for long periods, even centuries.

One imagines that the earliest weapons used in conflicts between groups of people were clubs and axes of various sorts. These early wars were fought in close proximity, delivering kinetic blows directly to another’s body.

By about 4,400 years ago the first copper daggers appeared, and by 3,600 years ago, bronze swords appeared, allowing for an attack at a slightly longer distance, perhaps out of direct reach of the victim. Even today our infantries are equipped with bayonets on the ends of guns to deliver direct kinetic violence to another’s body through the use of human muscles. With daggers and swords the kinetic blows could be much more deadly as they needed less human energy to cause bleeding.

Simultaneously the first “stand off” weapons were developed; bows and arrows appeared 12,000 years ago, most likely with a very limited range. The Egyptians had bows with a range of 100 meters a little less than 4,000 years ago. A bow stores the energy from human muscle in a single drawing motion, and then delivers it all in a fraction of a second. These weapons did not eliminate hand to hand combat, but they did allow engagement from a distance. With the introduction of horses, and later chariots, came the element of speed: a force could close from well outside engagement range to within it very quickly. These developments were all aimed at delivering bleed-producing kinetic impacts on humans from a distance.

A little less than 3,000 years ago war saw a new way to use kinetic energy: thermally. No longer was it just the energy of human muscles that rained down on the enemy, but also that from fire. First from burning crops, but soon by delivering burning objects via catapults and other throwing devices. Those throwing devices started out just delivering heavy weights, through the muscle energy of many people stored over many minutes of effort. But once burning objects were being thrown they could deliver the thermal energy stored in the projectile, as well as unleash more thermal energy by setting things on fire in the landing area.

During the 8th to 16th centuries, hurled anti-personnel weapons, those aimed at individual people, were developed: projectiles full of hot pitch, oil, or resin were thrown by mechanical devices, again with stored human energy, intended to maim and disable any individual human they might hit.

The arrival of chemical explosives ultimately changed most things about warfare, but there was a surprisingly long coexistence with older weapons. The earliest form of gunpowder was developed in 9th century China, and it reached Europe courtesy of the Mongols in 1241. The cannon, which harnessed that explosive power to deliver large amounts of kinetic energy in the form of metal or stone balls, provided both more distant standoff and more destructive kinetics. It was well developed by the 14th century, with the first man portable versions coming of age in the 15th century.

But meanwhile the bow and arrow made a comeback, with the English longbow, traditionally made from yew (and prompting a Europe-wide trade network in that wood), having a range of 300 meters in the 14th and 15th centuries. It was contemporary with the cannon, but its agility, being carried by a single bowman, led to it being the major reason for victory in a large scale battle as late as the Battle of Agincourt in 1415.

The cannon changed the nature of naval warfare, and naval warfare itself was about logistics and supply lines, and later being a mobile platform to pound installations on the coast from the safety of the sea. Ships also changed over time due to new technologies for their propulsion, from oars, to sails, to steam, and ultimately to nuclear power, making them faster and more reliable. Meanwhile the mobile cannon was developed into more useful sorts of weapons, and with the invention of bullets (which combined the powder and projectile into a compact pre-manufactured expendable device), guns and then machine guns became the preferred weapon of the ground soldier.

Each of these technological developments improved upon the delivery of kinetic energy to the enemy, over time, in fits and starts making that delivery faster, more accurate, more energetic, and with more distant standoff.

Rarely were the new technologies adopted quickly and universally, but over time they often made older technologies completely obsolete. One wonders how quickly people noticed the new technologies, how they were going to change war completely, and how they responded to those changes.

Latter Day War

In the last one hundred or so years, from the beginning of the Great War, also known as World War I, we have seen continued technological change in how kinetic energy is delivered during conflict. In the Great War we saw both the introduction of airplanes, originally as conveyances for intelligence gathering, but later as deliverers of bullets and bombs, and the introduction of tanks. Even with mechanization, the United States Army still had twelve horse regiments, each of 790 horses, at the beginning of World War II. They were no match for tanks, and hard to integrate with tank units, so eventually they were abolished.

By the end of World War II we had seen both the deployment of missiles (the V1 and V2 by Germany), and nuclear weapons (by the United States). Later married together, nuclear tipped missiles became the defining, but unused, technology that redefined the nature of war between superpowers. Largely that notion is obsolete, but North Korea, a small poor country, is actively flirting with it again these very days.

Another innovation in World War II, practiced by both sides, was massive direct kinetic hits on the civilian populations of the enemy, delivered through the air. For the first time kinetic energy could be delivered far inside territory still held by the enemy, and damage to infrastructure and morale could be wrought without the need to invade on the ground. Kinetically destroying large numbers of civilians was also part of the logic of MAD, or Mutually Assured Destruction, of the United States and the USSR pointing massive numbers of nuclear tipped missiles at each other during the cold war.

Essentially war is now either local engagements between smaller countries, or asymmetric battles between large powers and smaller countries or non-state actors. The dominant approach for the United States is to launch massive ship and air based volleys of Tomahawk Cruise Missiles, with conventional kinetic warheads, to degrade the war fighting infrastructure in the target territory, and then to put boots on the ground. The other side deploys harassing explosives both as booby traps, and to target both the enemy and local civilians, using human suicide bombers as a stand off mechanism for those directing the fight. As part of this asymmetry the non-state actors continually look for new ways to deliver kinetic explosions on board civilian aircraft, which has had the effect of making air travel worldwide more and more unpleasant for the last 16 years.

In slow motion each class of combatant changes their behavior to respond to new, and past, technologies deployed or threatened by the other side.

But over the whole history of war, rulers and governments have had to face the issue of what war to prepare for and where to place their resources. When should a country stop concentrating on sources of yew and instead invest more heavily in portable cannons? When should a country give up on supporting regiments of horses? When should a country turn away from the ruinous expense of yet higher performance fighter planes whose performance is only needed to engage other fighter planes and instead invest more heavily in cruise missiles and drones with targeted kinetic capabilities?

How should a country balance its portfolio between spending on the old technologies of war and putting enough muscle behind the new technologies so that it can ride up the curve of the new technology, defending against it adequately, and perhaps deploying it itself?

BUT HAS A NEW FORM OF WAR ARRIVED?

In the late nineteenth century fortunes were made in chemistry for materials and explosives. In the early part of the twentieth century extraordinary wealth for a few individuals came from coal, oil, automobiles, and airplanes. In the last thirty years that extraordinary wealth has come to the masters of information technology through companies such as Microsoft, Apple, Oracle, Google, and Facebook. Information technology is the cutting edge. And so, based on history, one should expect that technology to be where warfare will change.

Indeed, we saw in WW II the importance of cryptography and the breaking of cryptography, and the machines built at Bletchley Park in service of that gave rise to digital computers.

In the last few years we have seen how our information infrastructure has been attacked again and again for criminal reasons, with great amounts of real money being stolen, solely in cyberspace. Pacifists^{\big 2} might say that war is just crime on an international scale, so one should expect that technologies that start out as part of criminal enterprises will be adopted for purposes of war.

We have seen over the last half dozen years how non-state actors have used social media on the Internet to recruit young fighters from across the world to come and partake in their kinetic wars where those recruiters reside, or to wage kinetic violence inside countries far removed physically from where the recruiters reside. The Internet has been a wonderful new stand off tool, allowing distant ring-masters to burrow in to distant homelands and detonate kinetic weapons constructed locally by people the ring-masters have never met in person. This has been an unexpected and frightening evolution of kinetic warfare.

In the early parts of this decade a malicious computer worm named Stuxnet, most probably developed by the US and Israel, was deployed widely through the Internet. It infected Microsoft operating systems, sniffed out whether they were talking to Siemens PLCs (Programmable Logic Controllers), and whether those were controlling nuclear centrifuges. Then it slowly degraded those centrifuges while simulating reports that said all was well with them. It is believed that this attack destroyed one fifth of Iran’s centrifuges. Here a completely cyber attack, with standoff all the way back to an office PC, was able to introduce a kinetic (slow though it may have been) attack in the core of an adversary’s secret facilities. And it was aimed at the production of the ultimate kinetic weapon, nuclear bombs. War is indeed evolving rapidly.

But now in the 2016 US presidential election, and again in the 2017 French presidential election, we have seen, though all the details are not yet out, a glimpse of a future warfare where kinetic weapons are not used at all. Nevertheless these have been acts of war. US intelligence services announced in 2016 that there had been Russian interference in the US election. The whole story is still to come out, but in both the US and French elections there were massive dumps of cyber-stolen internal emails from one candidate’s organization, timed exquisitely in both cases down to just a few minutes’ window of maximum impact. Within minutes, thousands of seemingly unrelated people were looking through those emails, claiming to find clues to often ridiculous malevolence. In both elections the mail dumps included faked emails with sinister interpretations, uncovered by the armies of people looking through the emails for a smoking gun. These attacks most probably changed the outcome of the US election, but failed in France. This is post kinetic war waged in a murky world where the citizens of the attacked country can never know what to believe.

Let us be clear about the cleverness and monumental nature of these attacks. An adversary stands off, thousands of miles away, with no physical intrusion, and changes the government of its target to be more sympathetic to it than the people of the target country wanted. There are no kinetic weapons. There are layers of deception and layers of deniability. The political system of the attacked country has no way to counteract the outcome desired and produced by the enemy. The target country is dominated by the attacking adversary. That is a successful post kinetic war.

Technology changes how others act and how we need to act. Perhaps the second amendment to the US Constitution, allowing for an armed civilian militia to fight those who would destroy our Republic, is truly obsolete. Perhaps the real need is to equip the general population of the United States with tools of privacy and cyber security, both at a personal level, and in the organizations where they work. Just as WW II showed the obsolescence of physical borders to protect against kinetic devices raining from the sky, so too now we have seen that physical borders no longer protect our fundamental institutions of civil society and of democracy.

We need to learn how to protect ourselves in a new era of post kinetic war.

We see a proposed 2018 US Federal budget building up the weapons of kinetic war well beyond their current levels. Kinetic war will continue to be something we must protect against–it will remain an avenue of attack for a long time. We saw above how the English longbow was still a credible weapon, coexisting with cannon and other uses of gunpowder for centuries, though now its utility is long gone.

However, we must not give up worrying about kinetic war, but we must start investing in strength and protection against a new sort of post kinetic war that has really only started in the last twelve months. With $639B slated for defense in the proposed 2018 budget, and even $2.6B for a border fence, surely we can spend a few little billions, maybe even just one or two, on figuring out how to protect the general population from this newly experienced form of post kinetic war. I have recommendations^{\big 3}.

We don’t want the United States to have its own Kodak moment.



^{\big 1}For instance, in just the six months from this last October to April, more jobs were lost in retail in the US than the total number of US coal jobs. Not only have natural gas, wind, and solar technology decimated coal mining, with those jobs never to return, but information technology has enabled fulfillment centers, online ordering, and delivery to the home, decimating the US retail sector, a sector that is many times bigger than coal.

^{\big 2}I do not count myself as a pacifist.

^{\big 3}Where in the Federal Government should such money be spent? The NSA (National Security Agency) has perhaps the most sophisticated group of computer scientists and mathematicians working on algorithms to wage and protect against cyber war. But it is not an agency that shares that protection with the general population and businesses, just as the US Army does not protect individual citizens or even recommend how they should protect themselves. No, the agency that does this is NIST, the National Institute of Standards and Technology, part of the Department of Commerce.  It provides metrology standards which enable businesses to have a standard connection to the SI units of measurement.  It has also (with four Nobel prizes under its belt) advanced fundamental physics so that we can measure time accurately (and hence have working GPS), it has been a key contributor, through its measurements of radio wave propagation, to the 3G, 4G, and coming 5G standards for our smart phones, and it is contributing more and more to the biological measurements necessary for modern drug making.  But for the purpose of this note its role in cybersecurity is all-important. NIST has provided a Cybersecurity Framework for businesses, now followed by half of US companies, giving them a set of tools and assessments to know whether they are making their IT operations secure. And NIST is now the standards generator and certifier for cryptography methods.  The current Federal budget proposal makes big cuts to NIST’s budget (in the past its total budget has been around $1B per year).  Full disclosure: I am a member of NIST’s Visiting Committee on Advanced Technology (VCAT). That means I see it up close. It is vitally important to the US and to our future. Now is not the time to cut its budget but to support it as we find our way in our future of war that is post kinetic.

Patrick Winston Explains Deep Learning

rodneybrooks.com/patrick-winston-explains-deep-learning/

Patrick Winston is one of the greatest teachers at M.I.T., and for 27 years was Director of the Artificial Intelligence Laboratory (which later became part of CSAIL).

Patrick teaches 6.034, the undergraduate introduction to AI at M.I.T. and a recent set of his lectures is available as videos.

I want to point people to lectures 12a and 12b (linked individually below). In these two lectures he goes from zero to a full explanation of deep learning, how it works, how nets are trained, what are the interesting problems, what are the limitations, and what were the key breakthrough ideas that took 25 years of hard thinking by the inventors of deep learning to discover.

The only prerequisite is understanding differential calculus. These lectures are fantastic. They really get at the key technical ideas in a very understandable way. The biggest network analyzed in lecture 12a only has two neurons, and the biggest one drawn only has four neurons. But don’t be disturbed. He is laying the groundwork for 12b, where he explains how deep learning works, shows simulations, and shows results.
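If you want to tinker alongside the lectures, here is a toy sketch of my own in Python (not code from the lectures) at the same tiny scale: one input feeding a chain of two sigmoid neurons, trained by gradient descent on a single example via the chain rule.

    import math

    def sigmoid(z):
        return 1.0 / (1.0 + math.exp(-z))

    # A two-neuron chain: x -> neuron 1 -> neuron 2 -> output, squared-error loss.
    w1, b1, w2, b2 = 0.5, 0.0, 0.5, 0.0   # parameters (arbitrary starting values)
    x, target = 1.0, 0.8                  # one training example
    lr = 1.0                              # learning rate

    for step in range(1000):
        # Forward pass.
        a1 = sigmoid(w1 * x + b1)
        a2 = sigmoid(w2 * a1 + b2)
        loss = 0.5 * (a2 - target) ** 2

        # Backward pass: chain rule, using sigmoid'(z) = a * (1 - a).
        d_z2 = (a2 - target) * a2 * (1 - a2)
        d_w2, d_b2 = d_z2 * a1, d_z2
        d_z1 = d_z2 * w2 * a1 * (1 - a1)
        d_w1, d_b1 = d_z1 * x, d_z1

        # Gradient descent update.
        w1, b1 = w1 - lr * d_w1, b1 - lr * d_b1
        w2, b2 = w2 - lr * d_w2, b2 - lr * d_b2

    print(round(a2, 3), "vs target", target)   # output moves toward 0.8
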

This is teaching at its best. Listen to every sentence. They all build the understanding.

I just wish all the people not in AI who talk at length about AI and the future in the press had this level of technical understanding of what they are talking about. Spend two hours on these lectures and you will have that understanding.

At YouTube, 12a Neural Nets, and 12b Deep Neural Nets.

Adding to the Alphabet of Life

rodneybrooks.com/adding-to-the-alphabet-of-life/

There was a really important scientific result reported on this week^{\big 1} in the press. The original paper^{\big 2}, by a team at Scripps Research Institute in La Jolla, CA, a person in Grenoble, France, and a person in Henan, China, is behind a paywall at the National Academy of Science.

This team had previously introduced a new, unnatural base pair (UBP) into the DNA of an organism based on E. coli. In the past it had caused some toxicity to the organism and also tended to get deleted during reproduction.  The new result is that they synthetically modified the organism, getting rid of the toxicity, and showed that the UBP could survive 60 generations of reproduction.

Here is what normal DNA (deoxyribonucleic acid) looks like (from Wikimedia Commons):

There are two backbone chains, left and right, of alternating 2-deoxyribose and phosphate molecules, joined by complementary pairs of nucleotides, either Adenine (A) with Thymine (T) or Guanine (G) with Cytosine (C).  So reading down the left side of this fragment of DNA we have the code ACTG, and reading up the right side we have CAGT.
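To make that strand reading concrete, here is a tiny sketch of my own in Python (not from any of the papers discussed) that computes the complementary strand and shows why reading up the right side gives CAGT.

    # Watson-Crick pairing of the four standard bases.
    COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def complement(strand):
        """Base-by-base complement, read in the same direction as the input."""
        return "".join(COMPLEMENT[base] for base in strand)

    left_down = "ACTG"                    # reading down the left backbone
    right_down = complement(left_down)    # reading down the right backbone: TGAC
    right_up = right_down[::-1]           # reading up the right backbone: CAGT
    print(left_down, right_down, right_up)
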

There are lots of mechanisms involving DNA and RNA that are still not fully understood, but DNA is used for two purposes.  The letters on it encode genetic sequences which are used to construct proteins (it gets more complex every decade as we understand more), the stuff of life, and it is used to make copies of itself so that one copy can remain in a parent cell and another copy goes to a new child cell.

For producing proteins the two strands or backbones are pried apart by a molecular machine moving along the DNA, and an RNA molecule is built with complementary base pairs for a sub-length of the DNA. RNA (ribonucleic acid) looks like this, with just one backbone chain where ribose (which has five oxygen atoms rather than the four of deoxyribose) molecules and phosphate molecules alternate, and single bases, one of the four letters, hang off at regular intervals.

The process of producing RNA in this way is known as transcription.  The RNA then gets translated by another mechanism into amino acids which are linked together to produce proteins.  In all life on Earth the series of letters is used three at a time (which means 64 = 4\times 4\times 4 possible combinations of the four letters), of which in the “standard” setting 61 of the codings select for one of 20 amino acids, and the remaining three codings are used to say stop.  These 64 cases can easily be written down as a table for all the possible three letter sequences (which themselves are known as codons).  There are currently close to 30 (numbers change all the time…) variations on this code found in life on Earth–for instance vertebrates, invertebrates, and yeasts each use their own slightly different version of the table in translating the DNA in the mitochondria of their cells, coding for a total of 23 amino acids (I think…).
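As a toy illustration of this three-letters-at-a-time coding, here is another sketch of my own in Python; the CODON_TABLE below contains only a handful of entries from the standard table (the real one has 61 amino acid entries plus 3 stops), so it is for illustration only.

    from itertools import product

    print(len(list(product("GACT", repeat=3))))   # 64 possible codons = 4 x 4 x 4

    # A few entries from the standard translation table, for illustration only.
    CODON_TABLE = {
        "ATG": "Met",   # methionine, also the usual start codon
        "TGG": "Trp",   # tryptophan
        "GGC": "Gly",   # glycine
        "TAA": "STOP",  # one of the three stop codons
    }

    def translate(dna):
        """Read a coding sequence three letters at a time until a stop codon."""
        peptide = []
        for i in range(0, len(dna) - 2, 3):
            amino_acid = CODON_TABLE.get(dna[i:i + 3], "???")
            if amino_acid == "STOP":
                break
            peptide.append(amino_acid)
        return peptide

    print(translate("ATGTGGGGCTAA"))   # ['Met', 'Trp', 'Gly']
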

But here is thing one: since 1990 people have done experiments where they have modified simple organisms to change the meanings of some codons so that they produce amino acids (there are many of them known in nature) which are not coded for in any natural system.  We will come back to this.

The second thing that happens to DNA is reproduction, and that works as follows.  The double stranded DNA is fed into a little molecular machine which unzips it where the base pairs join, and then lets a complementary base and newly constructed backbone attach to each half of the DNA, spitting out, in a continuous fashion, two copies of the original DNA, where each copy has half of the actual atoms of the original.
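Continuing the toy Python sketches (mine, not the paper’s), this semi-conservative copying can be pictured as follows: each daughter double strand keeps one parental backbone and pairs it with a freshly built complementary one.

    COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

    def complement(strand):
        return "".join(COMPLEMENT[base] for base in strand)

    def replicate(top, bottom):
        """Each daughter keeps one parental strand plus a newly built partner."""
        return [(top, complement(top)), (complement(bottom), bottom)]

    # Two identical daughters, each containing half of the parent's original atoms.
    print(replicate("ACTG", "TGAC"))
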

Now what does this new paper do?  It has added a new pair of bases to an E. coli genome, and built a version of E. coli where the reproduction mechanism for DNA handles the new letters well, and where the existence of the new letters causes no real harm to the cell.

We can call the new bases by the letters X and Y, though as you can see from this diagram they have longer names.  This is figure 1A from the paper:

At the top we see a standard Cytosine-Guanine pair, and below that two variations of X and Y (the same X in the two cases) pairings.  In this latest paper they have shown that they can build a robust semisynthetic organism that carries these X and Y letters in the DNA, and preserves those letters well over at least 60 generations–that means at least 60 consecutive zippings apart and copyings of the DNA including the X’s and Y’s.  In one variation they experiment with all 16 possible three letter sequences which have X in the middle and one of the regular G, A, C, or T on either side.  They state that the “loss was minimal to undetectable in 13 of the 16 cases”.

For my commentary below let’s call this thing two.  We have now seen unnatural base pairs in a living organism being reproduced reliably.

Now the next thing that one imagines these scientists must be excited about is getting the transcription mechanism to handle the new letters, and then expanding the translation table from 64 entries to some bigger number.  The theoretical maximum would be 216 = 6\times 6\times 6, though so far they have not shown that any sequences with X’s or Y’s adjacent to each other are preserved.  But let’s call this combined result of two mechanisms thing three.

Thing one and thing two have been demonstrated.  Thing three has not.

But why am I writing this post? It is because I think thing two is a big deal for what life elsewhere might look like.

There has been some debate over whether life everywhere might look at the molecular level just like life here on Earth. I.e., perhaps it is the case that there is only one way to make life out of the chemistry that exists in our Universe (and we assume here for argument’s sake that chemistry is the same everywhere in the Universe, though there is debate about that).

We already thought, due to the multiple natural translation tables in Earth life (admittedly small variations on each other), and because thing one had been done and varied them further, that it might be reasonable to expect life, if we ever find it, elsewhere in the Solar System or further afield, to have different translation tables. In fact that has been a key question if we were to find life on Mars. If it has the same translation tables as on Earth we might presume that both forms of life came from the same place, perhaps Mars.  We have identified many meteorites on Earth that were once part of Mars, blasted off the surface of Mars by a large impact and eventually falling to Earth millions of years later. Perhaps they brought life with them.  But if we found DNA-based life on Mars to have a very different translation table from that on Earth we would tend to think that the life had arisen twice independently.

Now with thing two having been demonstrated in this new paper we might expect DNA based life on Mars to be even more different from that on Earth, perhaps using a different set of base pairs. Since we have XY and XY’ demonstrated in this paper, we could imagine that it is not such a big step to have life with none of GACT, but perhaps all based on XYZW, or PQRS, or perhaps IJKLMN. This opens up the possibilities mightily. It is no longer enough to assay samples from Mars for the four base nucleotides that we find on Earth and declare no life if we do not see them. Before we get ahead of ourselves however, we must wait for thing three to be demonstrated. But that will seal the fate of how we must look for life on Mars–in a much more expansive way.

Is there a thing four?  Yes, perhaps in another version of DNA/RNA based biology there are not three letters used for each amino acid.  In a simpler version there might be only two letters to determine a smaller number of possible amino acids, or in a more complex version four letters to determine a larger number.  The engineering challenges to modify Earth based life to perform this way are significant, so I would not expect to see that any time soon.  But it could have implications for life elsewhere.

Getting back to Earth biology people have been trying to understand how RNA and DNA showed up to make life anywhere. A fairly sure bet is that there were simpler mechanisms before the current mechanisms we see. Perhaps all that life got obliterated, competed away, by the much more stable RNA/DNA based life we see today. Or perhaps some of it is still hiding in isolated environments on Earth and we haven’t yet recognized it.

One hypothesis is that perhaps a much less stable form of life relied on the much simpler PNA (peptide nucleic acid) shown here, but using the same modern GACT.

This is a much simpler backbone and there are arguments that it could more easily have arisen spontaneously in the primordial soup, but it is not as stable as DNA for long term storage of genetic information.  People have been doing lab experiments for twenty years getting PNA with the standard GACT bases to interact with, and transfer sequences to and from, RNA and DNA.  There are independent arguments about how the redundant standard translation table (61 coding entries but only 20 different amino acids) could have evolved from a much simpler coding system.

I think thing two shows that we must be more expansive on what we believe the biochemistry of life elsewhere might be.

My own suspicion is that there is plenty of life out there that uses totally different coding systems, and totally different molecules than RNA and DNA.

And I am getting more and more convinced that our current tools for detecting life are “all the harder to see you with”!



^{\big 1}This particular story has some questionable wording in places. This is not an entirely new type of DNA. Rather it is completely conventional DNA but it carries a new pair of base nucleotides.

^{\big 2}Yorke Zhang, Brian M. Lamb, Aaron W. Feldman, Anne Xiaozhou Zhou, Thomas Lavergne, Lingjun Li, and Floyd E. Romesberg, A semisynthetic organism engineered for the stable expansion of the genetic alphabet, Proceedings of the National Academy of Science, www.pnas.org/cgi/doi/10.1073/pnas.1616443114

Research Needed on Robot Hands

rodneybrooks.com/research-needed-on-robot-hands/

This is a short piece I wrote for a workshop on what are good things to work on in robotics research.

One measure of success of robots is how many of them get deployed doing real work in the real world. One way to get more robots deployed is to reduce the friction that comes up during typical deployments. For intelligent robots in factories there are many sources of friction, some sociological, some financial, some concerning takt time, some concerning PLCs and other automation, but perhaps the most friction that can be attributed to a lack of relevant research results is the problem of getting a gripper suitable for a particular task.

Today in factories the most commonly used grippers are either a set of custom configured suction cups to pick up a very particular object, or one of a myriad of parallel jaw grippers varying over a large number of parameters, with custom fingers, again carefully selected for a particular object. In both cases just one grasp is used for that particular object. Getting the right gripper for initial deployment can be a weeks-long source of friction, and then changing the gripper when new objects are to be handled is another source of friction. Furthermore, grip failure can be a major source of run time errors.

Human hands just work. Give them an object from a very wide class of objects and they grip that object, usually with a wide variety of possible grips. They sense when the grip is failing and adjust. They work reliably and quickly.

Building more general hands for robots that require very little customization, that can dynamically grasp millions of different sized and shaped objects, that can do so quickly, that have a long lifetime over millions of cycles, and that just work would have significant impact on deployment of robots in factories, in fulfillment centers, and in homes.

Things like SLAM took many hundreds of researchers working for many years with an ultimately well defined problem (that definition took a few years to appear), and with access to low cost robots that could be used to produce dynamic data sets in many different environments.

Right now it is hard to define a mathematical criterion for a good robot hand, i.e., we can see nothing, and may never see anything, of comparable clarity to what we had for SLAM.

My strawman is that we will need concurrent progress in at least five areas, each feeding off the other, in order to come up with truly useful and general robot hands:

– new (low cost) mechanisms for both kinematics and force control
– materials to act as a skin (grasp properties and longevity)
– long life sensors that can be embedded in the skin and mechanism
– algorithms to dynamically adjust grasps based on sensing
– learning mechanisms on visual/3D data to inform hands for pregrasp

I think progress in any one of these areas alone will be hard to get adopted by research groups working on the others. The constraints between them are not well understood and need to be challenged and adapted to by all the researchers. This is a tall order. This is why grippers on factory robots today look just like they did forty years ago.