gabriel rosenkoetter on Sun, 16 Feb 2003 11:44:05 -0500



Re: [PLUG] how to lose your rights and freedom...


On Sat, Feb 15, 2003 at 04:01:22PM -0500, Bob Schwier wrote:
> The best encryption against the federal government would be seeming plain
> text.  If I were mounting such a campaign I would use codes, as has been
> done throughout history, that would seem to be innocent to someone reading
> over my shoulder.

Actually, that's one of the easiest systems to cryptanalyze.
Granted, it has the attraction that you won't see it unless you
know to look for it. But once you're suspicious of one party and
you start slurping a good corpus of his mail through some
pattern-repetition software, you'll discover pretty easily that a
code exists (there are well-known frequency distributions for both
words and letters in English prose, and you'd almost inevitably
end up violating them by generating this kind of enciphered
message on a regular basis), and with a few more samples you'll
break that code easily too.
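
To make the frequency point concrete, here's a rough sketch (mine,
in Python; not anybody's actual traffic-analysis software) of the
sort of letter-frequency check such pattern software might run.
The English percentages are approximate published figures; a big
enough corpus of contrived "innocent" prose tends to drift away
from them:

    # Flag text whose letter frequencies stray far from typical
    # English prose. Frequencies are approximate published values.
    from collections import Counter

    ENGLISH_FREQ = {
        'e': 12.7, 't': 9.1, 'a': 8.2, 'o': 7.5, 'i': 7.0, 'n': 6.7,
        's': 6.3, 'h': 6.1, 'r': 6.0, 'd': 4.3, 'l': 4.0, 'c': 2.8,
        'u': 2.8, 'm': 2.4, 'w': 2.4, 'f': 2.2, 'g': 2.0, 'y': 2.0,
        'p': 1.9, 'b': 1.5, 'v': 1.0, 'k': 0.8, 'j': 0.15, 'x': 0.15,
        'q': 0.10, 'z': 0.07,
    }

    def chi_squared(text):
        """Chi-squared distance from typical English letter frequencies."""
        letters = [c for c in text.lower() if c.isalpha()]
        counts = Counter(letters)
        n = len(letters)
        score = 0.0
        for letter, pct in ENGLISH_FREQ.items():
            expected = n * pct / 100.0
            if expected > 0:
                score += (counts.get(letter, 0) - expected) ** 2 / expected
        return score

    # Ordinary prose scores low; a pile of messages contorted to fit
    # a rigid word code scores noticeably higher over time.
    print(chi_squared("the quick brown fox jumps over the lazy dog" * 50))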

This is easier to decipher than Enigma, though harder to notice.
It's (US) football quarterback-grade encryption.

> Obvious encryption would only get NSA's attention.
> They may not know what it says, but they do know that it means something
> and something I wish to keep hidden.

Right, and assume the NSA can break 1024-bit keys right now anyway.
(No, really, do assume that. I wouldn't use this DSA key for
anything I wanted to keep a government out of.)

So send your real payload subliminally in another enciphered
message, especially one saying Bad Things (don't commit any crime
in that text, of course).

Or, better than doing two messages in the same stream (and leaving
some tell-tale data around), merely encipher twice. Anyone who
doesn't know to decipher twice (even with the same key) will never
get anywhere, because the first deciphering will never produce
anything recognizable. (Though, make sure that making an extra pass
through your algorithm doesn't leak key data over time. It may.)
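
As a toy illustration of the encipher-twice idea (my sketch, using
Fernet from the Python cryptography package as a stand-in for
whatever cipher you actually use):

    # Encipher twice with the same key; decrypting once yields only
    # more ciphertext, not the plaintext. Fernet (AES-based) from the
    # "cryptography" package stands in for your cipher of choice.
    from cryptography.fernet import Fernet

    key = Fernet.generate_key()
    cipher = Fernet(key)

    plaintext = b"nothing to see here"
    once = cipher.encrypt(plaintext)   # what a single decryption recovers
    twice = cipher.encrypt(once)       # what actually goes over the wire

    assert cipher.decrypt(twice) == once                      # still opaque
    assert cipher.decrypt(cipher.decrypt(twice)) == plaintext

(One caveat about this particular stand-in: Fernet tokens carry a
recognizable header, so a single deciphering would reveal that
another layer exists; a raw cipher mode hides the doubling better.
The point here is only the mechanics.)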

> I would suspect that that form of code is being transmitted as we speak
> with questions about "sister" or "brother" by al Qaida operatives.
> These guys are not fools and they know about command and control being
> compromised by broken codes.

Huh?

> I wish the right to codes to keep potential patent information protected
> or to keep my students from reading information I don't want them to know
> yet like their grades or information that they should not know like the
> grades of their fellows.

And consumer-grade encryption doesn't satisfy that wish?

> The guy who said "gentlemen do not read other gentlemen's mail" got fired
> a long time ago.  You should not say anything in your personal dispatches
> that you cannot tell the holy inquisitor to his face.

I guess. But what's Monty Python got to do with it?

On Sat, Feb 15, 2003 at 11:18:02PM -0500, Arthur S. Alexion wrote:
[...]
> his son.  We are living in very different times, and one party seems to
> be in control of all three branches of government.  So much for checks
> and balances.

This borders on political discussion inappropriate to PLUG (imho).
Those checks and balances weren't designed to balance between two
parties, they were designed to balance between three different
functions of government, and they still do that.

The problem is that the people's opinion is slowly being locked out
of the places where it should be listened to (by corporations
shouting louder, by politicians telling the people what they want
rather than the other way around, et cetera). And both major parties
are plenty guilty of that, and have been further back than 1973.

On Sun, Feb 16, 2003 at 08:33:51AM -0500, David Shaw wrote:
> On Fri, Feb 14, 2003 at 07:01:19PM -0500, gabriel rosenkoetter wrote:
> > This happens to be true for the exact format of PKI that OpenPGP
> > uses, but it's not generally true (not even generally true of PKI
> > systems).
> Not true for OpenPGP either.  You can have a different passphrase on
> your signing (sub)key than on your encryption (sub)key, even if they
> are the same "key" overall.

For RSA keys too? That makes sense to me in DSA (where, in fact,
you're using a different algorithm to sign than you are to encipher),
but I didn't think you could split RSA that way...

On Sat, Feb 15, 2003 at 06:49:05PM -0500, Paul wrote:
> Good idea.  I hope you're on our side!

Whose side of what?
 
> Chris Hedemark wrote:
> >Embed encrypted text inside of pictures, and post the pictures to 
> >bulletin boards.  The pictures should appear to be on-topic for the 
> >board in question.
> >
> >Stuff like Carnivore is only looking in email.

Incidentally, any steganographic posts on Usenet ARE liable to be
picked up, not by Carnivore (who needs Carnivore anyway? Usenet's a
commons!), but by the folks on cypherpunks trolling Usenet for
steganographic messages (to further the research of how to do
steganography).
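
For the curious, here's a bare-bones sketch of the
least-significant-bit trick (mine, not what any real stego tool
does), assuming Pillow and a lossless cover image like PNG; JPEG
recompression would destroy the payload:

    # Bare-bones LSB steganography: hide a byte string in the least
    # significant bit of each pixel channel of a lossless cover image.
    from PIL import Image

    def embed(cover_path, out_path, payload):
        img = Image.open(cover_path).convert("RGB")
        bits = []
        # 4-byte length header, then the payload, MSB first per byte.
        for byte in len(payload).to_bytes(4, "big") + payload:
            bits.extend((byte >> i) & 1 for i in range(7, -1, -1))
        channels = [c for px in img.getdata() for c in px]
        if len(bits) > len(channels):
            raise ValueError("payload too large for cover image")
        for i, bit in enumerate(bits):
            channels[i] = (channels[i] & ~1) | bit
        pixels = [tuple(channels[i:i + 3]) for i in range(0, len(channels), 3)]
        out = Image.new("RGB", img.size)
        out.putdata(pixels)
        out.save(out_path)

    def extract(stego_path):
        img = Image.open(stego_path).convert("RGB")
        bits = [c & 1 for px in img.getdata() for c in px]
        def read(n_bytes, offset):
            out = bytearray()
            for b in range(n_bytes):
                byte = 0
                for i in range(8):
                    byte = (byte << 1) | bits[offset + b * 8 + i]
                out.append(byte)
            return bytes(out)
        length = int.from_bytes(read(4, 0), "big")
        return read(length, 32)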

On Sat, Feb 15, 2003 at 03:33:18PM -0500, Paul wrote:
> What if someone were to use a false signature to contain a short,
> encrypted message?  Who would know the difference, other than the people
> who check your signature?

That's using a subliminal channel, and it's quite an old idea (it
can easily be done without damaging the validity of the signature,
though doing it with OpenPGP would make it obvious, to anyone who
knew what he was talking about, that there was more data than the
signature needed). There are some PKI systems constructed this
way, but I'd have to check my copy of Schneier to provide
references.
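
From memory of Schneier, the classic construction is Simmons's
subliminal channel in DSA-style signatures: the per-signature
nonce carries the payload, and anyone who holds the private key
can read it back out of an otherwise ordinary, valid signature. A
toy sketch in Python with deliberately tiny, insecure numbers:

    # Toy "broadband" subliminal channel in a DSA-style signature.
    # The hidden payload rides as the nonce k; a receiver who shares
    # the private key x recovers it from a perfectly valid signature.
    p, q, g = 607, 101, 64      # q divides p - 1; g has order q mod p
    x = 57                      # private key, shared with covert receiver
    y = pow(g, x, p)            # public key

    def H(msg):
        # Stand-in for a real hash, reduced mod q; enough for the math.
        return sum(msg) % q

    def sign_with_subliminal(msg, secret):
        k = secret % q or 1                          # hidden payload as nonce
        r = pow(g, k, p) % q
        s = pow(k, -1, q) * (H(msg) + x * r) % q     # inverse: Python 3.8+
        return r, s

    def verify(msg, r, s):
        w = pow(s, -1, q)
        u1, u2 = H(msg) * w % q, r * w % q
        return (pow(g, u1, p) * pow(y, u2, p) % p) % q == r

    def recover_subliminal(msg, r, s):
        return pow(s, -1, q) * (H(msg) + x * r) % q  # recovers k

    innocuous = b"wish your sister a happy birthday"
    r, s = sign_with_subliminal(innocuous, secret=42)
    assert verify(innocuous, r, s)                    # signature checks out
    assert recover_subliminal(innocuous, r, s) == 42  # receiver reads 42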

> Have you ever heard of "unlawful possession" or "possession...for an 
> unlawful purpose"?  For example, shooting people is illegal, and 
> carrying a gun is illegal because it is the tool that fires the bullet. 

Since when is carrying a gun illegal? Oh, you mean unlicensed:

> If you carry a gun you might intend to use it.  If not, to cover all 
> bases, the government has made possession illegal.  One law leads to 
> another until you have few, if any, options.  You can get a permit to 
> carry a gun.  The government decides if you can have it.  Even then, 
> there are very few circumstances in which you could actually use the gun.

Sure, and until late in the Clinton administration, it was illegal
to export strong crypto (where "strong" is a relative term... that
limitation was ludicrously low by the time it was removed).

My (and others') point is just that you only get nailed for
possession when you've done something else. Walking around flashing
an illegal gun MIGHT make a cop on the beat stop you, but illegal
guns (and illegal weapon charges) usually pop up when that gun is
*used* illegally.

The licensing concern is very real, but as I said the federal
government has already tried to force us to use a key escrow system
and failed utterly. They were still living in the 1960s, when the
only way to do crypto was in hardware, and figured that if the
only hardware crypto that could legally be sold was the system
they dictated (one with a backdoor), then that was all people
would use. Unfortunately, crypto's just math, and
personal computers have gotten very good at doing arbitrary math
that the users tell them to.

That's why I say the cat's out of the bag: even a law that flatly
forbade any use of crypto would be completely unenforceable, since
I can sit here and build up anything I need from first principles.
They're not restricting a product that requires materials for
manufacture and propulsion of ammunition; I don't even need a
computer to encipher something, I can do it on a pad of paper.
It's a fight the feds can't win, and they won't be making more
than feeble motions about it.
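
To put the pad-of-paper claim in concrete terms: a one-time pad is
nothing but addition mod 26, letter by letter. The few lines below
just show the arithmetic; the same thing works fine with a pencil
and a table of the alphabet:

    # One-time pad over A-Z: add the pad mod 26 to encipher, subtract
    # it to decipher. The pad must be random, secret, and never reused.
    def otp(text, pad, decrypt=False):
        a = ord('A')
        sign = -1 if decrypt else 1
        return "".join(
            chr((ord(t) - a + sign * (ord(p) - a)) % 26 + a)
            for t, p in zip(text, pad)
        )

    pad = "XMCKLQPWEZ"
    ct = otp("MEETATNOON", pad)
    assert otp(ct, pad, decrypt=True) == "MEETATNOON"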

> Should anything with the potential to be used in a criminal act become 
[...]
> them now than to try to win them back later?

You're preaching to the choir (and I don't especially like to be
preached to).

On Sat, Feb 15, 2003 at 09:04:58PM -0500, Arthur S. Alexion wrote:
> The Pennsylvania crime known as PIC (Possession of an Instrument of
> Crime) does create a separate, additional, and independent crime and
> penalty for merely using an "instrument" to commit a crime.  I would bet
> that a smart prosecutor could argue that cryptography can be an
> "instrument of crime".

I'd be interested in that argument, because it's not an instrument
in the traditional sense. That is, perhaps you could be brought up
for using the computer, but I'm not sure you could for the
cryptography. Can I be accused under the instrument clause if I
happen to know a good way to disable a burglar alarm with my bare
hands? Is that knowledge an instrument? If not, then why would
knowledge of cryptography be?

In response to the plausible argument that one can reduce having or
using a firearm to the knowledge to build one, I respond as I did
above: I need special materials to build a gun; I just need my brain
to do crypto.

-- 
gabriel rosenkoetter
gr@eclipsed.net

Attachment: pgpJ2RgUPnD5B.pgp
Description: PGP signature