Regarding the general idea of a translatewiki.net-based translation workflow
for KDE, as expressed here:

https://translatewiki.net/wiki/Translating:KDE#Akademy_submission:_A_web-based_translation_platform_for_KDE

First, it is great to continually experiment with new tools and workflows.
But any new workflow has to interface cleanly with existing mechanisms, in
order to demonstrate its advantages and disadvantages without too much
disruption; this is what most of this thread has been about so far.
Furthermore, I think it is reasonable to require that one or a few complete
language teams switch to the new workflow, to ease technical conflicts,
provided that no work will be lost if a team decides to abandon that
workflow after some time. Under these two conditions -- clean interfacing,
easy abandoning -- experimenting is just fine.

Second, I think that all existing web-based translation systems, including
translatewiki.net, are decisively flawed at the core, at least when applied
to free software translation. The major motivating assumption behind these
systems is this: the flatter the learning curve in translation-related
tools, the more translations will be made (at a given quality level). I
haven't seen this assumption backed up with numbers so far. As long as we
are operating on gut feeling, my assumption is that "online" translators
(those not willing to climb the "offline" tool curve) will be able to match
neither the quantity (summed over all contributors) nor the quality of
translation of "offline" translators.

The major technical drawback of web-based translation systems is that they
mix three distinct functions into one: translation hub, translation editor,
and translation checker. The result is that they are less efficient and
less capable in each of these functions than dedicated standalone tools.
Furthermore, since the tool is monolithic and remote, translators do not
even get a chance to experiment themselves with mixing in other tools.

Several counter-points have been made to this technical drawback, none of
which is very valid.

One is that translators no longer have to worry about particular
translation file formats, since the web interface presents everything in a
uniform way. However, there are no "particular translation formats" in free
software translation: there is pretty much only PO. The little that is not
PO is first converted into PO for translation.

Another counter-point is that translators no longer have to deal with
varying offline tools, such as a particular VCS. This is also not true,
because there are many web-based translation systems, so translators now
have to deal with varying online tools instead.

Finally, it is said: "if you wish, just export to PO, translate offline,
then import back". This means that for such a translator, the web-based
system reduces to a translation hub only, and a very inefficient one
compared to a VCS.

I will now directly address Niklas' motivation points from the page above:

> I've seen the various file formats

Quantitatively, I don't think this is significant for free software. When a
non-PO format is offered by the programmer, the best reply is: "please
switch to PO; it is the format best supported by free tools, as well as the
most capable format in its own right."

> emailing files around

It is easier to send by email than to upload. The idea here is that while
uploading is harder, it will be less of a burden to integrate, which means
more translations being made. This, however, still awaits quantitative
proof.

> digging version control history to find changes

There are two elements here. One is finding a particular past version of a
message, or listing its different past versions; the other is having a
proper diff.
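To illustrate what a message-level diff looks like, here is a minimal
sketch of a PO-aware comparison, in pure Python. It assumes two revisions
of a catalog are available as strings (a real tool would pull them from the
VCS), and the parsing is deliberately simplistic: single-line msgid/msgstr
pairs only, no plural forms or multi-line strings. The sample Serbian
translations are made up for the example.

```python
# Sketch of a PO-aware diff: report changes per message, not per line.
# Assumptions: two catalog revisions as strings; only simple one-line
# msgid/msgstr entries (no plurals, no multi-line strings, no contexts).

import re

def parse_po(text):
    """Map msgid -> msgstr for simple single-line entries."""
    entries = {}
    for m in re.finditer(r'msgid "(.*)"\nmsgstr "(.*)"', text):
        entries[m.group(1)] = m.group(2)
    return entries

def po_diff(old_text, new_text):
    """List added, removed, and changed messages between two revisions."""
    old, new = parse_po(old_text), parse_po(new_text)
    changes = []
    for msgid in sorted(set(old) | set(new)):
        if msgid not in old:
            changes.append(("added", msgid, new[msgid]))
        elif msgid not in new:
            changes.append(("removed", msgid, old[msgid]))
        elif old[msgid] != new[msgid]:
            changes.append(("changed", msgid, old[msgid], new[msgid]))
    return changes

# Two hypothetical revisions of the same catalog.
old_rev = 'msgid "Open"\nmsgstr "Otvori"\n\nmsgid "Close"\nmsgstr "Zatvori"\n'
new_rev = 'msgid "Open"\nmsgstr "Otvori datoteku"\n\nmsgid "Quit"\nmsgstr "Izlaz"\n'

for change in po_diff(old_rev, new_rev):
    print(change)
```

A line-based VCS diff of the same two revisions would interleave removed
and added lines without tying them to messages; the point of a PO-aware
tool is exactly this message-level pairing.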
Both of these functions are poorly supported by a line-based VCS, and the
solution is to use a PO-aware tool on top of the VCS. There is no
implication that this tool must be part of a bigger monolithic program,
much less that it must be web-based.

> read the source code to understand the context

For real quality translation, this is, and probably always will be,
necessary. What can be done is to equip the translation editor with good
automatic source reference resolving and display. Again, there is no
implication of a web-based tool here.

> proofread translations on a mailing list

(I didn't understand this, unless it is part of the following.)

> and waited months for my translations to end up in the product because of
> busy translation managers.

Most of the review time is taken by -- review. Not by copying the file
from whatever communication channel, not by opening it in an editor, not by
committing it. This is probably why I have been hearing the same complaint
in the context of web-based tools ("I did it in Launchpad months ago, but
no one has approved it yet"). It is true that the review process could be
made more efficient. But in my opinion, the usual approach to this ("the
approver") is conceptually the same regardless of the tool, and that is the
main problem in an environment with fluid contributor time.

-- 
Chusslove Illich (Часлав Илић)