> Am I correct in my assumption that you want to basically create a much
> more powerful Squeak/eToys (at least, the website gave me the
> impression)?

I'm blown away by these guys. They want to make a plasticine language, malleable not just at the level of syntax (that goes without saying!) but at the level of the virtual machine! Nepeivoda has repeatedly written that (a) programming styles are irreducible to one another, and (b) good programming needs all of them. The FONC people fully confirm this, then observe that all these styles nevertheless run on the same hardware, then ask themselves how far that hardware can be made convenient for supporting all these styles, and what layers need to be built on top of it. And while the hardware doesn't exist yet (it's in their long-term plans), they are building a reflective virtual machine (more precisely, given the "hardware" scale of their ambition, a reflective virtual virtual machine) for multi-style programming. It will be extremely interesting to watch what happens there.
This is one of the goals (Etoys more so than Squeak).
> This might seem like a stupid question, but what will this project
> offer to make a programmer more productive? I mean, did you study
> where code is wasted on "hacking" and what particular concepts will be
> provided that make it easy to describe algorithms, formats, etc.?
Having all aspects of the language (syntax, semantics, pragmatics) and system (tools, UI, etc.) visible, understandable and malleable, should help. It's Mohammed and the mountain, particularly with syntax and deep semantics. I think we can make these mountains light enough that they will move a considerable distance towards us without risking them "falling on us to our destruction."
I agree entirely with everything you say about transformations between formats and representations. My belief is that by placing (or providing the linguistic tools to place easily) the format and representation of data and programs in the most advantageous form for a given expression of intent, many of the software shortcomings you mention can be addressed.
> What I didn't see, yet, is a way to create a bijective
> description of a data format. For example, it could be possible to
> describe a network protocol with a combination of the data format and
> a finite state machine
That's a great example. If NFA states are really closures, with the closure function implementing the transition(s) by appending new closures to the 'epsilon' and/or 'next' sets, you can parameterise/conditionalise transitions and states by the current input symbol (a network packet header) and closure contents (session history). Building a TCP stack should be as easy as writing a regexp (or PEG or whatever) that describes the possible sequences of packets in a valid session (modulo transitions being more complex than simple 'match-and-append' and 'outgoing' states closing over all information relevant to them). It should be possible to build many protocols this way without having to introduce any 'global knowledge' into the data structures or algorithms.
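A minimal Python sketch of the "states as closures" idea, under loud assumptions: the state constructors, the handshake symbols, and the accept sentinel are all invented here, and epsilon transitions are omitted for brevity. The point is only that a state is a closure which, given the current input symbol, appends its successors to the 'next' set:

```python
# Hypothetical sketch: NFA states as closures (simplified, no epsilon set).
# A state is a function from the current input symbol to its successor states.

ACCEPT = object()  # sentinel 'accepting' state

def symbol_state(expected, successors):
    """Build a state that consumes one symbol and yields its successors."""
    def step(sym):
        return successors if sym == expected else []
    return step

def run(start_states, symbols):
    """Drive the closure set over an input sequence of 'packets'."""
    current = list(start_states)
    for sym in symbols:
        nxt = []
        for state in current:
            if state is not ACCEPT:
                nxt.extend(state(sym))   # closures append their successors
        current = nxt
    return ACCEPT in current

# Tiny 'protocol grammar': a TCP-like three-way handshake, SYN SYN-ACK ACK.
s3 = symbol_state('ACK', [ACCEPT])
s2 = symbol_state('SYN-ACK', [s3])
s1 = symbol_state('SYN', [s2])

assert run([s1], ['SYN', 'SYN-ACK', 'ACK'])      # valid session
assert not run([s1], ['SYN', 'ACK'])             # invalid session
```

A real stack would, as the text notes, parameterise the transitions by packet headers and session history carried in the closure contents; this sketch only shows the match-and-append skeleton.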
I think this scales. If the mechanism is reentrant it can do both bottom-up and top-down matching. At the simpler end you can make regular expressions that run as efficiently as 'egrep'. At the more general end it looks surprisingly close to message passing (if selectors were closures there might even be an isomorphism to exploit).
The transformation of a readable format into executable accessors is orthogonal and well understood. Making the description bijective is (I think) just a matter of making descriptions that can be driven either top-down (to build ASTs) or bottom-up (to 'pretty-print' the ASTs, concisely, as legal 'sentences' of the rules that generated them)?
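To make the "bijective description" concrete, here is a toy Python sketch (the header fields and format codes are invented): one declarative description of a packet header drives both directions, bytes to fields and fields back to bytes:

```python
# Hypothetical sketch: one data-format description, driven both ways.
import struct

# Single description of a made-up packet header: field name + struct code.
HEADER = [('version', 'B'), ('length', 'H'), ('flags', 'B')]
FMT = '>' + ''.join(code for _, code in HEADER)  # big-endian: '>BHB'

def parse(data):
    """Bottom-up: raw bytes -> field dictionary."""
    values = struct.unpack(FMT, data)
    return dict(zip((name for name, _ in HEADER), values))

def unparse(fields):
    """Top-down: field dictionary -> raw bytes."""
    return struct.pack(FMT, *(fields[name] for name, _ in HEADER))

# Round trip: the description is its own inverse.
original = {'version': 4, 'length': 20, 'flags': 2}
assert parse(unparse(original)) == original
```

The same `HEADER` list plays the role of the grammar: drive it one way to make the AST (here, a dict), the other way to emit a legal 'sentence' (here, the packed bytes).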
> (hopefully visually instead of as text)
I'd love to see this done without the visual metaphor obfuscating or limiting the expression of the data, algorithms and state machine (or any other mechanisms involved). Frankly, if I can describe the data format with ASCII art and then refer to the fields in the description transparently within my 'protocol grammar', I consider it a quantum leap forward compared to writing 'struct' and 'union' declarations.
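Just to show the ASCII-art idea isn't far-fetched, a toy Python sketch (everything here is invented: the layout notation, the one-dash-per-byte convention, the field names) that compiles an ASCII-art header picture into field accessors:

```python
# Hypothetical sketch: an ASCII-art layout compiled into field accessors.
# Toy convention (an assumption of this sketch): one '-' per byte of width.
ART = """
+----+--+-+
|seq |id|f|
+----+--+-+
"""

def layout(art):
    """Read the border row for widths and the label row for names."""
    border, names = [l for l in art.splitlines() if l.strip()][:2]
    widths = [len(c) for c in border.strip('+').split('+')]
    labels = [c.strip() for c in names.strip('|').split('|')]
    fields, offset = {}, 0
    for name, width in zip(labels, widths):
        fields[name] = (offset, width)
        offset += width
    return fields

FIELDS = layout(ART)

def field(data, name):
    """Transparent accessor: refer to a field by the name drawn in the art."""
    off, width = FIELDS[name]
    return data[off:off + width]

pkt = bytes([1, 2, 3, 4, 9, 9, 7])
assert field(pkt, 'seq') == bytes([1, 2, 3, 4])
assert field(pkt, 'f') == bytes([7])
```

Compared to parallel `struct`/`union` declarations, the picture and the accessors can never drift apart, because the accessors are derived from the picture.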
> Maybe what we need is a simple
> way to transform data into different representations such that the
> algorithms can always operate on the most appropriate representation.
Or have the data transform itself, dynamically, according to how it's being used? A single 'collection' type that occasionally reorganises itself internally (or employs other mechanisms for optimising a subset of the possible operations into constant time) as an array, stack, queue, list, deque, set, bag, (some kind of) tree, stream, etc., depending on how you're using it at the moment. This is, I think, a hard problem and one well worth solving.
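A crude Python sketch of the self-reorganising collection (the class name, the threshold, and the list-to-set policy are all invented): it starts life as an ordered list and, once membership tests dominate, silently rebuilds itself as a set for constant-time lookups:

```python
# Hypothetical sketch: a collection that reorganises itself based on usage.

class AdaptiveCollection:
    LOOKUP_THRESHOLD = 3   # invented heuristic: reorganise after 3 lookups

    def __init__(self):
        self._items = []   # start life as a plain ordered list
        self._lookups = 0

    def add(self, item):
        if isinstance(self._items, set):
            self._items.add(item)
        else:
            self._items.append(item)

    def __contains__(self, item):
        self._lookups += 1
        if (isinstance(self._items, list)
                and self._lookups >= self.LOOKUP_THRESHOLD):
            self._items = set(self._items)   # reorganise for O(1) lookups
        return item in self._items

c = AdaptiveCollection()
for i in range(100):
    c.add(i)
assert 42 in c
assert 41 in c
assert 40 in c                        # third lookup triggers reorganisation
assert isinstance(c._items, set)
```

The hard part the text alludes to is, of course, not this mechanism but choosing the policy: when the cost of reorganising pays for itself, and how to do it without breaking the operations the old representation was good at.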
Along the way I also found some interesting slides from Piumarta (March 2007): http://vpri.org/pipermail/jitblit/attac
UPDATE: and this just arrived from Piumarta:
> From what I understand, LOLA should eventually enable a developer to
> work effectively with all of the above syntax flavors (and many
> others) on top of a common object/execution model, without the fuss
> of SWIG nor the crushing weight of a JVM or CLR runtime.
That's the idea. We should be able to go beyond even domain-specific languages, to what I've been calling 'mood-specific languages'. If it makes my (e.g.) message-passing code more readable to be able to write 'x[y,z]' instead of '(x at: y) at: z' for a three-line region in the middle of some function or method, I want to be able to instantiate my new syntactic convention for just those three lines of code. It certainly won't look anything like this...
    push-syntax expr += expr-1[expr-2,expr-3]
                     -> ((expr-1 at: expr-2) at: expr-3) ;
    c[i,k] = a[i,j] * b[j,k].
but the closer we can get to the spirit of the above, the better.
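To see what the rewrite in that sketch actually does, here is a Python approximation (a plain regex transform, nothing like a real scoped syntax extension, and the rule is hand-translated from the sketch above):

```python
# Hypothetical sketch: the 'x[y,z] -> ((x at: y) at: z)' rewrite,
# approximated as a regular-expression transform over one line of source.
import re

RULE = re.compile(r'(\w+)\[(\w+),(\w+)\]')

def expand(line):
    """Rewrite every x[y,z] into ((x at: y) at: z)."""
    return RULE.sub(r'((\1 at: \2) at: \3)', line)

assert expand('c[i,k] = a[i,j] * b[j,k].') == \
    '((c at: i) at: k) = ((a at: i) at: j) * ((b at: j) at: k).'
```

The real mechanism would of course be a grammar extension scoped to those three lines, not a global textual substitution; this only illustrates the expansion the sketch specifies.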
(IMO, the CLR's biggest mistake was that it wasn't written in itself.)
And more from Piumarta today:
Several of us here (at VPRI) are interested in the potential of better synergy between hardware and software. Something we'd love to see is the *OLA back-end generating netlists for FPGAs, or better still reprogramming them on the fly (which I'm told is possible, but I've yet to see an example). Just-in-time software deserves just-in-time hardware, no? ;-)