00:03
<Philip`>
Does it actually add it to the DOM, or does the innerHTML getter synthesise it for output?
00:04
<Philip`>
(e.g. adding it just before it emits a tag that uses that namespace)
00:46
<Lachy>
Philip`, I don't know how it works in detail. But I don't believe it's possible to see it via any other DOM methods. But that's probably because IE doesn't support the ProcessingInstruction node interface
02:15
<jcranmer>
falling asleep on a bus takes about ten minutes for me
02:15
<jcranmer>
er, wrong channel
02:16
<takkaria>
good to know, though. :)
02:17
<jcranmer>
(related to the rock slide on the main Whistler-Vancouver highway)
02:19
<jcranmer>
I now get to get on a bus at midnight, catch a flight at noon, and get to my home airport at 10 PM (assuming no problems), to get home closish to 11
04:14
<Hixie>
we crossed the 2000 pending e-mail barrier! only 1933 e-mails left!
04:45
<Hixie>
i ordered something from amazon which shipped and went to the USPS' HEBRON KY facility
04:45
<Hixie>
and then stopped
04:45
<Hixie>
it's been there for days now
05:05
Hixie
chuckles at daniel's e-mail
05:05
<Hixie>
i'm glad i'm not the only one who had that reaction :-)
10:08
<zcorpan>
LOL
10:09
<zcorpan>
http://www.w3.org/mid/Pine.LNX.4.62.0807310535280.3295⊙hdc
10:09
<zcorpan>
Hixie: do i always do that? :P
10:11
<Hixie>
wow, someone actually read to the end :-D
10:12
<zcorpan>
heh
10:49
<MikeSmith>
Hixie: do you have online or in subversion the source for the W3C version of the header for the HTML5 spec?
10:49
<MikeSmith>
I want to try to just give you a patch with the proposed additions
10:49
<MikeSmith>
if that'll work for you
10:50
<MikeSmith>
otherwise, I can just work from the Overview.html that's checked into dev.w3.org CVS
10:51
<Hixie>
i'd rather just get the list of changes you want, but if you really want to do a patch, the source file is http://www.whatwg.org/specs/web-apps/current-work/header-w3c
10:51
<MikeSmith>
OK
10:51
<MikeSmith>
I can just send you the list too
10:52
<Hixie>
bed time now, anyway :-)
10:52
<Hixie>
nn
10:52
<MikeSmith>
night
10:53
<MikeSmith>
hsivonen: you around? I wanted to ask if the sources for the schemas you use for validator.nu are static sources you maintain manually or if you have a build that uses Petr Nalevka's schemas in some way.
10:54
<MikeSmith>
The reason I ask is that I remember when I reported the error with the <ins> content model, you made a change in http://svn.versiondude.net/whattf/validator/trunk/schema/xhtml10/edit.rnc
10:56
<MikeSmith>
hmm.. or does validator.nu actually use the http://svn.versiondude.net/whattf/syntax/trunk/relaxng schemas at all?
10:59
<MikeSmith>
oops, I asked because I saw "svn log" was telling me no changes had been made to the revision.rnc file since January
10:59
<MikeSmith>
but I see the content model was already correct in there
11:00
<MikeSmith>
but then, I guess if validator.nu had actually been using that instead of Petr's schema, I would not have observed the error I got
11:17
<hsivonen>
MikeSmith: the preset schemas are maintained by me manually.
11:17
<MikeSmith>
hsivonen: I see
11:17
<hsivonen>
and yes, http://svn.versiondude.net/whattf/syntax/trunk/relaxng are used for HTML5
11:21
<MikeSmith>
hsivonen: I'm confused about why I ran into that error in spite of the http://svn.versiondude.net/whattf/syntax/trunk/relaxng having had the correct content model at that time
11:21
<MikeSmith>
as far as I can see
11:22
<hsivonen>
MikeSmith: was the error in HTML5 or XHTML 1.0?
11:22
<hsivonen>
(afk)
11:22
<MikeSmith>
ah, I guess was checking XHTML 1.0 source
11:22
<MikeSmith>
that had to be what it was
13:50
<hsivonen>
Philip`: what headers do I need to show to wikimedia servers in order to be able to download stuff?
13:52
<Philip`>
hsivonen: I think I might have just had to remove wget's User-Agent header
13:55
<hsivonen>
hmm. I wonder how to compute the /0/0b/ bit: http://upload.wikimedia.org/wikipedia/commons/0/0b/2001-5_Gubernatorial_election_map.svg
13:55
<Philip`>
First two hex digits of the MD5 of the filename, if I remember correctly
13:56
<hsivonen>
as UTF-8 I presume?
13:56
<Philip`>
I didn't worry about that :-)
13:57
<Philip`>
$ echo -n '2001-5_Gubernatorial_election_map.svg' | md5sum
13:57
<Philip`>
0b4c683b9c7a33f7178e7276e49c261e -
13:57
<Philip`>
(That seems about right)
13:57
<hsivonen>
and is /0/ always /0/?
13:57
<Philip`>
That's the first hex digit of the MD5
13:58
<Philip`>
and the /0b/ is the first two
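[The scheme Philip` describes can be sketched in Python; treating the filename as UTF-8, which hsivonen only presumes, is an assumption here:]

```python
import hashlib

def commons_path(filename):
    # Wikimedia Commons buckets uploads by the first one and the
    # first two hex digits of the MD5 of the (UTF-8-encoded) filename.
    digest = hashlib.md5(filename.encode("utf-8")).hexdigest()
    return "/%s/%s/%s" % (digest[0], digest[:2], filename)

print(commons_path("2001-5_Gubernatorial_election_map.svg"))
# → /0/0b/2001-5_Gubernatorial_election_map.svg
```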
13:58
<hsivonen>
Philip`: thanks
13:58
<hsivonen>
it appears that concocting a non-wget UA string doesn't work, but using a browser-looking string works...
13:59
<Philip`>
What about no UA string at all?
13:59
<hsivonen>
Philip`: 403
13:59
<Philip`>
Oh, okay
14:00
<Philip`>
(I think my download script was just a line of bash and it's not in my command history and it's not saved anywhere else)
14:27
<hendry>
how do you specify required/mandatory in the w3c webidl? a comment?
14:31
<Lachy>
hendry, in what context?
14:32
<hendry>
Lachy: for e.g. http://dev.w3.org/geo/api/spec-source.html#position latitude, longitude are mandatory, whilst heading isn't
14:34
<Lachy>
I think you have to specify that in the prose, where you define how to create the object, by specifying what each value must be set to in different circumstances
14:36
<hendry>
Lachy: do you have an example?
14:38
<zcorpan>
hendry: why not just allow it to always be null?
14:38
<Lachy>
well, it currently says "... If the implementation cannot provide heading information, the value of this attribute must be null." That seems fine to me
14:39
<Lachy>
but it doesn't define what to do if either the latitude or longitude isn't available.
14:40
<hendry>
just for extra clarity i thought it could be in the IDL
14:40
<hendry>
as well as the "prose" / text below
14:45
<Lachy>
hendry, in the section just above that for the PositionOptions interface, it says "The errorCallback and enableHighAccuracy attributes are optional." It's not clear to me what that means, nor where the errorCallback attribute is defined
14:46
<Lachy>
unless you're referring to the errorCallback parameter used in other methods defined earlier in the spec
14:47
<Lachy>
hmm, I guess I should review that spec in more detail one day
14:47
<hendry>
Lachy: sooner the better. Web 3.0 is waiting my man
14:48
<Lachy>
are you the editor of the spec?
14:48
<hendry>
Lachy: no
14:48
<Lachy>
oh, I thought by the way you were asking, you were wondering how you could improve the spec.
14:50
<hendry>
yes, of course i want to improve it :) i want it done yesterday
14:53
gsnedders
wonders whether to be a pedantic asshole
14:53
<gsnedders>
"the text of header elements" — what's "the text"?
14:53
<Lachy>
gsnedders, you are always like that
14:53
<Lachy>
;-)
14:54
<gsnedders>
Lachy: No, almost always :)
14:54
<Lachy>
gsnedders, it may refer to the text content
14:54
<Lachy>
but it's difficult to know without a bit more context
14:54
<hsivonen>
wow. generating namespace prefixes without bugs is more complex than I had thought
14:55
<gsnedders>
Lachy: Look at 4.3.7 The header element
14:58
<Lachy>
gsnedders, I assume it's referring to the text that would be used for the purpose of creating a document outline. But if it's not used in the outline algorithm, then I don't know what it's for
14:58
<Lachy>
hsivonen, why are you trying to generate namespace prefixes?
14:58
<gsnedders>
I suppose for giving the header of the outline
14:58
<gsnedders>
But for the sake of numbering, do I number that header?
14:59
<hsivonen>
Lachy: because using a serializer written by someone else and left unmaintained sucks too
14:59
<Lachy>
what do you mean by the header of the outline?
15:00
<gsnedders>
s/outline/section/
15:00
<Lachy>
hsivonen, why is the content you're serialising using namespaces?
15:01
hendry
wonders if there is a JS graphing library that could render something like this in canvas or SVG http://port70.net/~nsz/dwm-sloc.png
15:01
<hsivonen>
Lachy: If I write an XML serializer, I write one that works in the general case
15:01
<Lachy>
oh, ok
15:02
<hsivonen>
anyway, it probably surprises no one, but getting namespaces right is by far the hardest part of writing a SAX to XML serializer
15:03
<hsivonen>
Lachy: if I didn't make it general purpose, and later I'd use it in an unexpected way, Philip` could provoke Validator.nu to emit ill-formed output and I could be ridiculed as a bozo
15:03
<Lachy>
LOL
15:07
<zcorpan>
Hixie: the link on http://www.whatwg.org/mailing-list should be #commits not #commit
15:08
<zcorpan>
hsivonen: but everyone is a bozo anyway
15:10
<hsivonen>
Lachy: also, I want a serializer that doesn't leak the prefix choices from an XSLT program into the output
15:12
<gsnedders>
I wonder how complex it would be to take a CharMapML file and a binary string, and convert it to Unicode
15:13
<takkaria>
oh, WF2 integration in HTML5. that'd be nice
15:24
<Lachy>
hsivonen, then in such cases, where do you obtain the appropriate prefixes from, or do you just generate them randomly?
15:25
<Lachy>
gsnedders, what?
16:49
<Philip`>
hendry: Why JS? It seems easier to e.g. use Gnuplot's SVG output, if you just want a static graph in a scalable web-friendly format
16:50
<hober>
hendry: google's chart api could probably handle that
17:54
<takkaria>
Philip`: I'm amazed at how long a sort -R of dmoz urls is taking
17:55
<Philip`>
takkaria: How much RAM do you have?
17:55
<takkaria>
just the 2GB
17:55
<Philip`>
I think 'sort' uses temporary files as scratch space when it's processing a lot of data
17:56
<Philip`>
The first time I did this, I used Perl to slurp the entire file into an array in memory and then randomise it and print it out again
17:56
<Philip`>
which used rather a lot of RAM, but did actually work and didn't take forever (but wasn't particularly fast either)
17:56
takkaria
appears to only be using ~40% of RAM at the moment
17:57
<Philip`>
where "rather a lot" was a bit more than 2GB, if I remember correctly
17:57
<Philip`>
(on a machine with 4GB RAM, so that was alright)
17:57
<takkaria>
ouch
17:57
<takkaria>
I don't have any machines with RAM >2GB
17:58
<takkaria>
times like these it would be nice to have a shell account on a supercomputer
17:58
<Philip`>
You could trim the file down substantially, e.g. skip a random 99% of the lines, and then randomise the output from that
17:59
<Philip`>
(By the way, be careful to remove duplicate URLs, and be careful to unescape the RDF XML (which I forgot to do since I just parsed it with regexps :-( ))
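[Philip`'s trimming suggestion can be sketched as follows; the function name and the keep-fraction are illustrative, not from the log:]

```python
import random

def sample_and_shuffle(lines, fraction=0.01, rng=random):
    # Keep each line independently with probability `fraction`
    # (e.g. 0.01 to skip a random 99% of the lines), then shuffle
    # the much smaller sample entirely in memory.
    sample = [line for line in lines if rng.random() < fraction]
    rng.shuffle(sample)
    return sample
```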
18:01
<takkaria>
yeah, I did the replacements and uniq stuff
18:01
<takkaria>
I found a very efficient method of extracting the URLs was simply perl -i -pe 's/&quot;/"/g' uniq-urllist
18:01
<takkaria>
grep "&" uniq-urllist | grep -v "amp"
18:01
<takkaria>
perl -i -pe 's/&amp;/&/g' uniq-urllist
18:01
<takkaria>
er, that was the wrong thing to paste
18:01
<takkaria>
'grep "http://" content.rdf.u8 | cut -d\" -f2 >urllist' is what I meant
18:02
<takkaria>
then "grep ^http" again, to filter out the odd description/name which had an URL in it
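[One way to combine takkaria's extraction with Philip`'s unescape-and-dedup warnings in a single pass, as an illustrative sketch rather than what either of them actually ran:]

```python
import html
import re

def extract_urls(rdf_text):
    # Pull quoted attribute values out of the RDF dump, unescape XML
    # entities properly (&amp;, &quot;, ...), keep only http URLs
    # (dropping the odd description/name), and remove duplicates
    # while preserving order.
    seen, urls = set(), []
    for value in re.findall(r'"([^"]*)"', rdf_text):
        url = html.unescape(value)
        if url.startswith("http://") and url not in seen:
            seen.add(url)
            urls.append(url)
    return urls
```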
18:03
<Philip`>
Just out of interest, do you find 'grep' to be massively slower than 'LANG=C grep'?
18:04
<Philip`>
That was a pain to me when dealing with large files, until I realised setting LANG made it much faster, probably because my default locale was doing UTF-8 decoding of every line or something
18:05
<takkaria>
LANG=C is faster, but only by about 1% or so
18:05
<takkaria>
my normal LANG is en_GB.UTF-8
18:07
<Philip`>
Hmm, I'm using en_GB.UTF-8 by default too, but it goes ~30x faster with LANG=C
18:07
<Philip`>
on at least the two computers (one Gentoo, one Fedora) I've tested it on
18:07
<takkaria>
I'm an Ubuntuite
19:47
<nzkoz>
hey guys, does anyone know the best way to get in contact with the validator.nu maintainer? I've been noticing strange results lately, just wondering if I should be reporting them somewhere
19:47
<takkaria>
nzkoz: the maintainer is hsivonen, he's here quite a lot
19:48
<nzkoz>
takkaria: thanks, I'll keep an eye out over the next few days, any idea what timezone he's in?
19:48
<takkaria>
european time
19:49
<takkaria>
possibly GMT+2 or GMT+3, I'm not sure
19:49
<nzkoz>
cool, same as me, so hopefully we overlap a bunch :)
20:23
<gsnedders>
nzkoz: UTC+3
20:24
<nzkoz>
cheers
20:37
<takkaria>
Philip`: I now have a system that does roughly what yours does, except without Java ;)
20:39
<takkaria>
I don't cache headers, mind, but I do use wget -m which should be roughly the same
20:40
<Philip`>
takkaria: Does it do it with parallelism too? :-)
20:40
<takkaria>
yup
20:41
<Philip`>
Does it get confused by infinitely long web pages? :-)
20:41
<takkaria>
nope
20:43
<takkaria>
I get the master page list, split it into files of 100 urls each, then write a make rule that takes each list, and runs wget with a quota
20:43
<takkaria>
so make -j 4 does all the work for me wrt parallelism
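[takkaria's split-into-chunks-and-let-make-parallelise setup could be approximated in Python like this; the function names are hypothetical and `fetch_chunk` stands in for whatever downloads one chunk, e.g. a wget call with a quota:]

```python
from concurrent.futures import ThreadPoolExecutor

def chunks(urls, size=100):
    # Split the master url list into lists of `size` urls each.
    for i in range(0, len(urls), size):
        yield urls[i:i + size]

def fetch_all(urls, fetch_chunk, workers=4):
    # A pool of 4 workers plays the role of `make -j 4`: each chunk
    # is handed to fetch_chunk, and results come back in order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch_chunk, chunks(urls)))
```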
20:48
<Philip`>
Sounds quite nice, then :-)
20:48
<takkaria>
I'm reasonably happy with it
20:48
<Philip`>
Hmm, only 4?
20:48
<takkaria>
I'm on a 100kB/s downstream connection :\
20:49
<takkaria>
I'm just writing all the steps I did to get it up and running into a makefile so that you can run "make fetch", it'll grab the rdf list, grep, sed, uniq, sort, randomise, and fetch
20:49
<Philip`>
Ah, right
20:50
Philip`
likes university internet connections :-)
20:50
<takkaria>
I'm tempted to pop within wifi range of mine tomorrow and get the 2MB/s connection
20:51
<Philip`>
I think the best I've seen is something like 50Mbit/s
20:53
<takkaria>
speeds like that make me a little bit sad
20:53
<takkaria>
that said, I should have a4MB
20:53
<takkaria>
... a 4MB/s connection fairly soon
21:25
<gsnedders>
Why on earth is Time Machine claiming not enough space?
21:27
<gsnedders>
ah! the sparsebundle is too small
21:27
<Hixie>
timemachine sparsebundles auto-grow, no?
21:27
<gsnedders>
Um, yeah. Can't you create a sparseBundle big enough, kthxbai.
21:28
<gsnedders>
Hixie: They have a predefined max. size, IIRC
21:28
<Hixie>
for me, time machine just grows it whenever needed
21:28
<Hixie>
having said that i have a whole bunch of other problems doing timemachine over the network
21:29
<gsnedders>
Hixie: If you open the sparseBundle, you can find its capacity
21:29
<Hixie>
and there's a maximum it won't go over?
21:29
<Hixie>
maybe that _is_ the problem i've been seeing then
21:29
<gsnedders>
Yeah
21:32
<gsnedders>
Hixie: hdiutil resize -size max -growonly [path to sparseBundle]
21:32
<gsnedders>
Hixie: Just don't have it mounted ;)
21:32
<Hixie>
disk utility lets you change all that much easier :-)
21:32
<gsnedders>
Does it allow you to do that with disk images?
21:32
<gsnedders>
Oh.
21:33
<Hixie>
:-)
21:33
gsnedders
shrugs
21:33
<gsnedders>
CLI is simple enough :)
21:33
Hixie
has disk utility on his dock, at home
21:33
<Hixie>
my mac mini is so slow, poor mini
21:35
gsnedders
has Terminal in his dock
21:35
<gsnedders>
That does everything :)
21:47
gsnedders
ought to pack before leaving tomorrow
21:47
<Hixie>
Terminal is the first thing I put in my Dock, always :-)
21:47
<Hixie>
usually, the only apps i have running are Terminal and 1 to 4 browsers
21:53
<gsnedders>
Hixie: Don't forget Finder!
21:53
<gsnedders>
:P
21:54
<Hixie>
i'd turn that off if i could!
21:57
<gsnedders>
Finder, Mail.app, Safari, Colloquy, SubEthaEdit, iTunes, and Terminal
22:11
<gsnedders>
Hixie: Re: <header>, if I'm numbering the section headers, where should the number go? The first highest ranking hx?
22:11
<Hixie>
yes
22:11
<Hixie>
though i would just treat <header> as implying no-toc no-num
22:11
<Hixie>
in a spec
22:12
<gsnedders>
meh. I don't really want behaviour like that anyway
22:12
<gsnedders>
by default I only do stuff from depth 2–6
22:13
<gsnedders>
I'll fix that for RC1
22:14
<gsnedders>
(I currently don't do anything special for header, more or less)
22:14
<gsnedders>
(in terms of it's text)
22:14
<gsnedders>
s/'//
22:15
<Hixie>
not doing anything special is fine too
22:15
<gsnedders>
well, I basically just treat it as if it were an hx element
22:15
<Hixie>
it's not like we use it; nor will we use it for anything but the actual <h1> header
22:15
<Hixie>
i expect
22:16
<gsnedders>
Which will lead to problems, I expect
22:16
<gsnedders>
But we don't use it yet, so who cares atm?
22:17
<Hixie>
indeed
22:20
<gsnedders>
And I want to get b1 out before I go on holiday, and probably offline for two weeks
22:20
<Hixie>
no rush
22:20
<Hixie>
probably don't want to switch to spec gen just before publication anyway
22:20
<Hixie>
which will be in a couple of weeks or so
22:20
<Hixie>
so that's fine
22:21
<Hixie>
?
22:22
<gsnedders>
It's a whole lotta notihng
22:22
<gsnedders>
*nothing
22:24
gsnedders
adds a new test, just 'cos it fails
22:24
<Philip`>
Uh
22:24
<Philip`>
My internet connection is pretty broken, so I was just randomly hitting keys into my dead SSH session, and it looks like they actually got transmitted
22:26
<Hixie>
hah
22:30
<gsnedders>
Hixie: svn.whatwg.org is down :\
22:31
<Hixie>
wfm
22:31
<Hixie>
no load or memory problems on the machine at the moment
22:31
<Hixie>
only minor nfs lag, nothing serious
22:31
<Hixie>
no unusual processes, either
22:33
<gsnedders>
working again now
22:34
gsnedders
gets to the thing which he doesn't want to document
22:34
<gsnedders>
Detection of w3c status
22:34
<gsnedders>
This took me a while to reverse engineer :P
22:35
<gsnedders>
(in part because I was misled by the docs)
22:51
<gsnedders>
I probably should've started writing the ack section months ago
23:14
Hixie
adds a ten-day moving average to http://www.whatwg.org/issues/data.html for no apparent reason
23:23
gsnedders
doesn't really want to write, "normally, the normal string substitutions are…"
23:28
gsnedders
wonders how to avoid [DATE] being parsed
23:28
<gsnedders>
This is making it rather hard to document
23:29
<gsnedders>
Hixie: Is <!----> the shortest conforming comment?
23:30
<Hixie>
i believe so
23:30
<gsnedders>
Because although the text can't start with U+002D, the text is optional?
23:31
<Hixie>
right
23:31
<Hixie>
well
23:31
<Hixie>
iirc, the text isn't optional, but it can be the empty string
23:31
<Hixie>
which is the same thing!
23:32
<gsnedders>
Hixie: text is may
23:32
<gsnedders>
Hixie: Thus it isn't required
23:34
<Hixie>
ah ok
23:34
<Hixie>
well there you go then
23:34
<gsnedders>
(string subs must be contained within a single text node, thus having a comment that is empty stops it from being repalced)
23:34
<gsnedders>
*replaced