
Whilst it’s nice to see Delphi in the news again in a context other than the Borland firesale, an interview with Wayne Williams recently posted on The Register is something of a mixed bag when it comes to content.  There is some confidence-inspiring news but also disclosure of some rather puzzling priorities.  Specifically: “Williams says cross-platform is now a higher priority than a 64-bit compiler”.

The full interview is available here.

My own thoughts?

Well, I was not at all surprised at the comment that Embarcadero are not competing with Visual Studio and that VS is a .NET IDE.  I have believed for a long time that Microsoft effectively abandoning native code development tools was an open goal that Borland failed to capitalise on, and that their biggest mistake was trying to compete with Microsoft in the .NET tools space.

Time will yet tell whether Prism will fare any better for Embarcadero in this regard, but I see nothing but good sense in Embarcadero concentrating – in the main – on native code development tools.  But the statement that 64-bit is now a lower priority than cross-platform surprised me.  To say the very least.

I am left wondering if Windows ISVs are in fact on the edge of the Delphi community rather than at the heart of it, as I thought.  I simply do not recognise an environment that seeks commercial tools for Linux development.  That community is famously well served by free alternatives.  Well served because the Linux ecosystem remains – as far as I know – primarily one populated by free software.

This is not an ecosystem that companies required to pay for their tool chain can sensibly engage in, even at the – relatively – bargain price of an All-Access license.

So where is this demand for full-priced, Linux desktop application development tools coming from?

And why does Embarcadero seem to think that 64-bit is less important?

Mac and Linux at this stage are interesting nice-to-haves.

VERY interesting, certainly, but still only nice-to-haves.

I can only hope that their information is accurate and reliable.  64-bit support is even more overdue than Unicode was.  If Embarcadero leave us waiting much longer they might find when it’s finally ready that there is no-one left who cares anymore.

I would be interested to know what you think.

48 thoughts on “Cross Platform More Important Than 64-Bit?”

  1. My own opinion: cross platform support can be achieved via cross-compiling single source code for different platform + remote debugging.

  2. Must be difficult to decide which bandwagon to jump on next! There’s wailing and cries of “Delphi is Dying” if you don’t, and there are dead products and yet more legacy stuff to support later if you do.

    If they have truly sliced through the cross-platform Gordian knot, I’ll be very interested to see how.

    Despite not having any need for 64 bits personally yet, I agree that, with Win7 here, it really has a far higher priority than what I fear is yet another tilt at windmills. Plus, it cannot be as difficult, surely?

  3. Jolyon, you are preaching to the converted. As a small ISV we’ve been waiting on Delphi’s promises of 64-bit for years now. Our specific need is server-side support for 3GB+ of memory, preferably with VCL, but without it would be liveable for now.

    We’ve had servers theoretically capable of huge memory spaces, and MS operating systems to support that memory, for many years now. Where is Delphi? Still stuck with a 32-bit address space, effectively limiting us to 3GB addressable (without PAE or WOW etc).

    If we’d known 3 years ago what we know now, then we’d be just one more company swimming in the Microsoft pond, having bitten the bullet and ditched Delphi. We’ve had customers out there for 2 years now with a desperate need for that extra memory address range, and we’re losing prospects and revenue because it is missing, with no sign of Embarcadero delivering.

    As you say, we can only hope that their marketing research is accurate, but I very much doubt it. Yet again, they are chasing after the candyfloss. Granted, Unicode is important for a lot of people, and we took that one on the chin. But to me, the imminent “Weaver” release is window dressing; who actually has a real need to support Win 7 and multi-touch now? Sure, it’s nice to be at the cutting edge and be able to pitch the tool at new markets and companies who are looking to get into native Win 7 development. Everything else in “Weaver” smacks of a holding position until they can get one of the major features complete – 64-bit / multi-platform.

    Embarcadero must remember that they are effectively withdrawing promises made by Wayne Williams at the time of the CodeGear acquisition to deliver on the existing roadmap – Commodore being next in line. That is a loss of long-term credibility. They must stop chasing rainbows and address the fundamentals.

    As a tool, Delphi is my spade. What I need is a stronger, sturdier spade, one capable of lifting heavier loads. I do not need one with a multi-coloured handle, built-in torch, funky web-driven compass or any other gizmo that has no place being on a spade in the first place.
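    The 3GB ceiling mentioned above is simple pointer arithmetic: a 32-bit pointer can address at most 4 GiB, of which 32-bit Windows normally reserves 2 GiB for the kernel (3 GiB of user space with the /3GB boot switch). A minimal Python sketch of the numbers involved:

```python
# Theoretical address space for a given pointer width.
GIB = 2 ** 30  # bytes in one GiB

def address_space_gib(pointer_bits):
    """Total GiB addressable with pointers of the given width."""
    return (2 ** pointer_bits) / GIB

total_32 = address_space_gib(32)  # 4.0 GiB total for a 32-bit process;
                                  # user mode normally sees only 2-3 GiB
total_64 = address_space_gib(64)  # 2**34 GiB for a 64-bit process

print(total_32)                   # 4.0
print(total_64 / total_32)        # 64-bit widens the space by 2**32
```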

  4. My own thoughts:

    I think 64 bits is a must, like Unicode. But cross platform can give to Delphi a very important support on enterprise decision-making.

  5. If I understood correctly from David I’s visit to Brussels last week, Embarcadero split up the development teams and so they are working on several features at the same time…

    For the people who can read Dutch check: http://www.nldelphi.com/Forum/showthread.php?t=34011.

    I guess cross-platform and 64 bit are both things they should do in their compiler, so that’s probably 1 development team?

    I agree that 64-bit is probably more wanted than cross-platform.

  6. Delphi/Win32 is great, but what if you want to do server-side software, software for mobile phones, etc.? CodeGear needs to fit into existing solutions for this. .NET is clearly a dead end, but making things run on top of Mono, the Java runtime, the Python runtime or even something else is a good idea.

  7. I think that “slowing down” the 64-bit development is a good choice in the scenario Embarcadero is in (developing x-plat and 64-bit at the same time, in parallel). And the reason is simple: the financial crisis makes companies reluctant to invest strongly in new hardware, so A LOT of the Windows market will remain 32-bit for some more time (a lot of articles have been published about this recently).
    So it is good to “speed up” the xplat part of the development so we can reach the Mac market especially as soon as possible (oh, and Linux too, but Mac is far more important). Anyway, I think Embarcadero is not going to give us 64-bit support too late; I’m confident they know it is really important too.

  8. Personally, I want Linux before 64-bit. But for Linux, a real 64-bit compiler is even more important, as 32-bit works pretty badly on a typical 64-bit system. So in the end they have to do both things anyway.

    Something that I really dislike is not having an IDE that runs ON Linux. These days I run Delphi only inside an XP VirtualBox. It sounds a little weird: they want you to believe that cross-compiling is possible (and they will probably say “it is very easy, just hit Ctrl+F9” as they usually do) but don’t manage to port their very own IDE.

    On the other hand – I don’t think I am a very typical customer 🙂

  9. Exactly my point:

    There already ARE native-development tools for each mobile phone platform.

    Is it reasonable for CodeGear to expend scarce resources to fully understand each one of them to effectively produce a competitive native-code development product? Of which the market for each individual one is – how big?

    Or, which developer is going to want to target sluggish layers of mono or java or whatever – just so we can use a Pascal syntax to develop for each mobile phone platform?

    Not to mention the dilemma of unifying the quirks of each OS under one VCL whilst still exposing their individual capabilities efficiently.

    I can understand the attraction of having four buttons: Compile Linux, Compile Win32, Compile Mac, Compile Mobile Phones. But it ain’t that simple.

  10. I don’t *need* 64 bit support (yet), but I know that I will need it sooner than I’ll need to be able to code for Linux or Mac users. That being said, I’m not too concerned about what was said in the interview – truth and the contents of an interview with upper management rarely share the same room.

  11. Windows is weakening in its absolute hold on the client side. With netbooks and portables, the monoculture is cracking a bit. The more interesting thing that I got from this interview is the mention of OSX. It also appears that the focus is a bit different, as the paradigm appears to be “develop on Windows, compile for X”. A bit different from Kylix, where the kludge appeared (to me) to be in the IDE. (What a Frankenstein!)
    I think that CodeGear is trying to remain relevant in a multiple OS world.
    My guess is that we will see Cross Compile on Linux and OSX, then will come Android (if the anticipated use of Android on Netbooks accelerates).
    The cross-compilation from a single code base is the hook. If they can pull it off with NO changes in source code, there will be a strong reason to use Delphi for native client development. Imagine writing a commercial app and being able to deploy with no muss and no fuss to Windows, Linux, OSX, and (perhaps) Android? Why would you NOT want to use Delphi.
    The question is if they can pull it off. Personally, I love rooting for the underdog!

  12. You should read Michael Swindell’s comments on Marco Cantù’s blog; he says that these are 2 parallel projects (with the same priority) and they think that xplat (cross-platform) will be ready sooner. Swindell claims that while writing a new compiler and VCL may seem harder than implementing 64-bit, it’s not so, because with 64-bit you have to think a lot about backward compatibility, but with xplat you don’t.
    I am intrigued by the cross-platform VCL. If it’s done right it could be revolutionary.
    While I would like to have 64-bit support, I don’t see it being mainstream for a couple of years; then again, I don’t think that Linux or OS X will go mainstream soon either.

  13. I can understand the attraction of having four buttons: Compile Linux, Compile Win32, Compile Mac, Compile Mobile Phones. But it ain’t that simple.

    REALBasic does a pretty good job of it for 3 of those 4 buttons. 🙂

  14. I have seen Michael Swindell’s comments but didn’t see anything in those that really contradicted Wayne Williams observations.

    Parallel development may well be going on, and it makes perfect sense. But priorities within the wider organisation determine which of those parallel efforts will be afforded access to the resources needed to get their job done sooner, rather than later.

    Michael Swindell says that both 64-bit and xplat are under way. I don’t doubt him (both are predicated on the rearchitected compiler for one thing). But Wayne Williams says that of the two, xplat will be released before 64-bit not because it’s easier (something I seriously doubt) but because it has higher priority, making specific reference to market demands.

    Michael suggests that xplat is easier because it is a “greenfield”.

    That has some merit, but the 32-64 bit transition is not the same paradigm shift that 16-32 bit was. I personally don’t see that a 64-bit VCL should be *that* much harder than an all-new VCL.Cocoa or VCL.X etc.

    I’m not saying that cross platform support isn’t desirable. Far from it. I think it’s HIGHLY desirable.

    I just don’t think that it is MORE important to the future of Delphi than 64-bit Windows support.

    As for timing… recession is the PERFECT time to innovate, preparing your company to benefit from the RECOVERY.

    When the upswing DOES come (as it will), and companies start investing in 64-bit (or rather, making use of the investment they have likely already made – most if not all current and recent PCs are 64-bit hardware), then the company ready and waiting with a 64-bit tool chain for developers to exploit will be poised to make hay when that sun starts shining.

    In the meantime, the lure of a new market can be irresistible – the potential for growth is huge compared to trying to grow in a market in which you already have a presence. But it is also much harder to break into a market than it is to grow from a position in a market in which you are already established.

    And the Mac/Linux developer markets are surely still places where hearts rule the minds. Are Mac/Linux developers – as a whole – really going to want to embrace Delphi… the “Windows, Pascal” tool?

    As many Delphi developers that there may be that might be interested in doing Mac/Linux work in addition to their Windows projects, what proportion of the Mac/Linux developer marketplace does that really represent?

    Meanwhile the *existing* Windows developer community is crying out for high productivity development tools for 64-bit native code, because our users and customers are themselves crying out for it.

  15. IMHO they’re going to make another “hole in the water”. I agree that native xplatform could mean a lot for Delphi in the long term, but its immediate market appeal is yet to be confirmed.
    Linux/Mac developers will hardly buy a Windows IDE – especially a Pascal one – and I think many Delphi developers like to talk about xplatform development, but when faced with the intricacies of developing for “alien” OSes they may give up. And no, your PHP/Python/Ruby skills don’t matter here; native development requires a good knowledge of the underlying OS, and Linux/Mac (BSD) can be quite different from Windows. And you may not have all the fancy controls and libraries you use under Windows.
    Meanwhile Windows 64-bit is gaining quite some momentum. 64-bit processors have been sold for years now, and most computers are 64-bit capable. On desktops, the large address needs of recent video cards and falling memory prices are pushing users towards 64-bit OSes. Video editing, databases, virtualization and even mail servers like Exchange are pushing 64-bit strongly. Even my entry-level virtual server hosting my site runs Windows 2003 64-bit.
    As they did with Unicode, they’re chasing new butterflies while delaying much-needed improvements, because those don’t shine so much on leaflets. Yesterday it was .NET, now it’s cross-platform. Meanwhile, long-time Delphi customers needing to update their applications to fully exploit current Windows OSes are always left behind.

  16. I do not think entry back into the Linux market will be technically challenging. It seems clear from the VCL source that Kylix has been kept up in a back room somewhere. Oh, it may not be a fully polished product any more, but clearly it is seeing some support.

    That said, they will STILL fail to monetize it because the money just isn’t there. GPL fanatics are destroying and dismantling the industry one step at a time, and this is just one of the early indicators. The thought that you can make money developing for these people is still seriously misled.

    There may, however, be some money in the Mac market, and chances are that some people will buy the product and it may even have some small success there.

    HOWEVER, if history has shown us anything, it is that small markets like Mac are notoriously difficult to turn into a proper return on investment. There is a reason that Apple sells most of the software you will ever need on a Mac – most vendors would starve to death trying. Unless you find a product that every single Mac user can’t live without, chances are you will go broke sooner than later trying. Apple already has those markets all sewn up.

    And let’s face it, once you take the fraction of the market that are developers of the fraction of the market that ARE Mac users – you have some VERY slim pickings indeed. The chances that developers will invest in desperately overpriced machines to enter a market that is 5% of what is currently available to them is even more deluded than trying to make money off the unwashed GPL hordes.

    And finally, apparently Embarcadero needs to learn the lesson Borland repeatedly failed to learn. When you turn your back on your core customer base, you frequently regret it. And since 64 bit is very important to that core customer base, you REALLY have to wonder – WTF are they thinking?!

  17. Given I started the thread “WTF from Wayne Williams…”, you’ll probably not be surprised that I agree wholeheartedly with you on this.

    We’ve been holding out for 64-bit for several years. We’ve already told our customers that as soon as 64-bit support is available, we’ll be dropping 32-bit.

    We are developing a large desktop (VR) application, and really don’t care about Mac or Linux…

  18. There are a lot of Macs out there.

    Almost everyone I know under the age of 30 owns one. I’m expecting these kids to lead the charge in introducing them into the office over the next decade.

    Linux I don’t really care much about, but I don’t think Linux support is the main goal of this cross platform effort.

    64bit, while important, doesn’t take me anywhere I need to go right now.

  19. It’s one thing to “lead a charge”, but how successful that assault may be is not dependent on the enthusiasm or youthful exuberance of the chargers, but more on the size and steadfastness of the force they are charging *against*.

    For Mac to take over the desktop Microsoft would need to make not just one but a series of Vista-sized mis-steps.

    On the evidence of Windows 7 so far, I’d say that the notion that the Vista debacle signposted the beginning of the end for Microsoft was premature. They seem to have taken that hard knock and learned their lesson. And well.

    Very well.

    As for which is the main goal – Mac or Linux…. it’s worrying that there are mixed signals on that score. Wayne Williams seems to be saying that domestically (in the US) the Mac is the biggest draw, but internationally that it is Linux.

  20. The way I see things, these two projects MUST be done in parallel, irrespective of which one is seen as “more important” or “easier”. Looking at it from Embarcadero’s perspective, my reasoning goes like this…

    You cannot do an effective cross platform (Mac/Linux) solution that will stand the test of time, without already having crossed the 64 bit barrier. It would be madness to create a whole new cross platform VCL, only in 32 bit, and then immediately have to upgrade it to 64 bit. It only makes sense if it is designed as 64 bit from the ground up, even if you only release a 32 bit version initially, due to other factors (like the lack of a 64 bit production-stable compiler).

    On other platforms, 64 bit seems to already be more prevalent than it is on Windows, as only a very small percentage of “users” in the Windows space are currently running 64 bit. So a cross platform solution that does not address the 64 bit issue would be mostly ignored by the industry, whereas it seems to be largely a “server” issue for Windows at the moment (although that will change in the next year or two) so 32 bits still has more life left in it on Windows than it does on other platforms.

    On the other hand, you don’t want to commit to your 64 bit solution, without having a really good understanding of what is needed on the other platforms first. It would be insane to build a whole new compiler (which from what I have read, seems to be the basis of the 64 bit project) without taking into account what will be needed on the other platforms that you will have to support in the near future. Just knowing this in theory is not enough. To know enough about what the other platforms need, you must have first hand, practical experience. So bring on the cross platform project as a perfect way to get that first hand experience.

    So in summary, the cross platform project is critical to making sure the 64 bit project is done properly, and the 64 bit project is critical to making sure the cross platform project is done properly. These two projects are highly dependent on each other.

    As to which one is “easier”, I agree that 64 bit issues should be much easier to solve, in isolation, than the cross platform issues, and they appear to have been thinking about the 64 bit issues for much longer as well. However this is exactly WHY they must get their feet wet with a cross platform release, BEFORE the 64 bit solution is set in concrete. The very success of the 64 bit project depends on it.

    We all know (project management 101) that for any complex project to be successful, you must eliminate your major technical barriers as early in the development cycle as possible, as these will be the most difficult things to fix later in the project. Unfortunately, from hard experience, we also know that addressing complex technical issues only in theory, or in the lab, is not enough. It is not until the product is released to the users that you find out what you missed in the initial design.

    So for the good of the whole Delphi product line, we end up needing to have a cross platform release, which may seem to be “less important” (to many people at least), before the “more important” 64 bit release.

  21. IMHO, 64-bit should come first. I don’t even care about a Linux/UNIX desktop. If targeting server applications, what’s the point in using a 32-bit OS/application nowadays?

  22. @ncook:

    There is a key difference between 64-bit and the Mac/Linux support tho, and that is highlighted by Michael Swindell.

    For 64-bit we are not talking about an all-new VCL. Precisely the opposite. The objective will be that an existing Win32 application will be essentially only a recompile away from 64-bit nirvana.

    i.e. the 64-bit project is about 64-bit codegen at the compiler level and ensuring that the *current* VCL will compile to both 32 and 64 bit targets.

    The codegen part is the bit that’s tied into the Mac and Linux projects, since they are sensibly tackling 64-bit code gen as just A.N.Other codegen “backend” to a common compiler front end (one can only presume that even the 32-bit codegen will then also get an overhaul, if only to fit into this new compiler architecture).

    But the VCL parts I think are different.

    And I don’t think it will come down to “Old VCL” vs “New VCL”, where the Old supports Win32 and the New supports everything else.

    There simply cannot be a one-size-fits-all VCL that would be acceptable for producing “first class” Win64, Mac and Linux applications from a single source (which would be the only purpose of a 100% single source cross platform VCL).

    Rather I expect – or hope – that there will be a single-source, cross platform foundation (Classes, SysUtils, etc.) underpinning platform-SPECIFIC visual control frameworks, making full use of, and providing, true “native” user interfaces on the various platforms for which such native frameworks are provided.

    Quite apart from the different control sets and styles, consider the very real functional and operational differences between an Apple Mighty Mouse and a typical Windows mouse, and then try to imagine how (a) a 100% single-source cross-platform VCL could cope with those differences, and (b) how you would go about writing a single-source application consuming that VCL that would nevertheless result in a “first class” application in both Windows and Mac environments.

    Just as a for example.

    I don’t think CodeGear’s resources are best deployed in trying to create “Magic Boxes” to take care of these details that for very many users won’t be of any concern.

    By which I mean, most Delphi users will continue to be interested in Windows development. Some may be interested in using Delphi to create Mac apps.

    Only a small proportion will be interested in wanting to [use Delphi to] create apps that run on Mac, Win and Linux (or perm any 2 from 3) from a single source.

    I think it is possible to say this with some confidence because those people that DO want to do that can already do it, today with Java or C++ and QT. They are unlikely to be sitting around waiting for Delphi to deliver this capability.


  23. To Embarcadero, this is my most important argument:
    My company WILL buy a Delphi Win64 upgrade.
    My company will NOT buy a Delphi Xplatform upgrade.
    I have been hearing stories about how the Mac will overtake Windows for the last 20 years. Never happened. Not going to. Same for Linux, but for a slightly shorter period.

  24. At the end,

    “Delphi 64” will be the first “cross platform delphi”

    I think Delphi 64 for Windows will not make too much money for Embarcadero, but they need it to be present in the market (it’s a must).

    Imho, one different thing is Delphi for MacOS; Macs hold only a small percentage of the computer market, but everyone (C# users, C++ users, Delphi users, Java users) wants to develop for the MacOS market and create real MacOS applications, because the MacOS market is hot now and absolutely in fashion, and this could make big revenues for Embarcadero and take some users from other environments.

    About the VCL, I think if we don’t want to make the same mistakes as Java, we need different VCLs for different OSes.

  25. @jjb:

    You missed my point. Code-level cross compilation is only a small part of cross-platform development. Please see Luigi D. Sandon’s comments above.

    But if you have cross platform development experience in REALBasic, could you perhaps comment on this:

    I have an application that programs music synthesizers over MIDI. It uses the Microsoft MIDI API. How would REALBasic compile that for Mac/Linux?

    I have libraries that use intimate Windows knowledge to create a high performance front-end (DirectX?). How would REALBasic compile that for Mac/Linux?

    I have industrial applications that communicate with machines over the RS232 interface (Win95-WinME). It partly uses the old IBM BIOS interrupt routines. How would REALBasic compile that for Mac/Linux?

  26. “kids to lead the charge in introducing them into the office over the next decade.”
    There are several factors that will hinder Mac overall adoption:
    1) Companies are not going to give each employee expensive desktops or notebooks – some high-end employee/manager/executive may select a Mac, but not everybody – other hardware may mean significant savings.
    2) You are bound to a single hardware supplier – which also sets the prices.
    3) Weak server-side offerings, both hardware and software. (Almost) homogeneous networks can better exploit server OS capabilities.
    4) Many applications are missing – it is true you can run them in a VM or the like – but if those are your main applications it makes very little sense, use a Mac just for a better browser and fashionable HW??

  27. Delphi 1 was about making life simpler. Nowadays it’s nearly impossible to make young people enthusiastic about Delphi. If we don’t want Delphi to die, we need to consider their needs and not ours. They really don’t give a **** about cross platform or 64-bit. CodeGear has to learn “thinking sexy” again…

  28. Microsoft doesn’t produce 32-bit server operating systems anymore. Windows 2008 R2 is 64-bit only. The software running on these servers will be 64-bit also, both databases and webservers. And since 64-bit applications can’t use 32-bit plugins, writing extensions will soon be very difficult.
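    The bitness constraint above is why plugin hosts have to check architecture up front: a 64-bit process simply cannot map a 32-bit native library (or vice versa). A hedged Python sketch – `plugin_compatible` is a hypothetical helper, but `struct.calcsize("P")` really does report the running process’s pointer size:

```python
import struct

def process_bits():
    """Pointer width of the running process: 32 or 64."""
    return struct.calcsize("P") * 8

def plugin_compatible(plugin_bits):
    # A host can only load native plugins built for its own bitness,
    # so reject mismatched binaries before handing them to the OS loader.
    return plugin_bits == process_bits()

print(process_bits())  # 32 or 64, depending on the interpreter build
```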

  29. I really do not understand Embarcadero.
    Just look at the past sales of Kylix and decide:
    was it a good decision?

    Delphi lost a lot of developers while Borland was busy developing Kylix.

    I think a huge group will leave while X-Platform is being developed.

  30. @Ken Knopfli
    I agree. True cross-platform is a pipe-dream. Another example is any Win32 app that uses the registry. There is no such thing in Linux and on Macs. One of my 32-bit Windows apps uses the Windows multimedia subsystem to capture any sound on the PC and stream it through a winsocket over TCP/IP to any “listening” station running my “receiver” software. This would need entire rewrites to work under Linux and OSX. There may as well be 3 separate apps! Also, I have to keep half an eye out for Win9x platforms which are 32-bit only. Therefore 64-bit is not such a huge concern currently. Any of the 64-bit Windows server products runs my 32-bit Windows software with no speed or compatibility issues. I believe Embarcadero should freeze new features and sort out the mess that is their 2009 product suite – too many issues in QC remain unresolved.

  31. Our clients are screaming for 64-bit versions of the software. All their servers are on 64-bit now, and it is a royal pain diddling about with both 32-bit and 64-bit Oracle drivers, for example.

    I can’t think of a single application of ours that I need to have as cross-platform.

  32. In days gone by, Delphi was my secret weapon. Really RAD. Microsoft was a pain by comparison.

    Now that I have to use VS/C# at work, I see how small additions, like the “summary/param/returns” comments feature boost my productivity.

    For those that don’t know, in C# you can comment functions by adding a summary comment and, for each parameter, a short explanation.

    Then, when you use that function elsewhere, the param comments pop up as you type, intellisense style, and the summary pops up when you hover over it with the mouse.

    Going back to Delphi (2007 in my case) just seems so old by comparison.
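    For readers who haven’t seen the feature: structured doc comments feeding editor tooltips are not unique to C#. As a rough, purely illustrative parallel in Python (chosen only to keep the example self-contained and runnable), the docstring plays the same role as C#’s summary/param comments and is what editors surface on hover:

```python
def transpose(mat):
    """Return the transpose of a rectangular matrix.

    mat: a list of equal-length rows.
    Returns a new list of rows with rows and columns swapped.
    """
    return [list(col) for col in zip(*mat)]

# Editors show the docstring on hover or at the call site,
# much as Visual Studio shows the <summary> text for a C# method.
print(transpose.__doc__.splitlines()[0])
print(transpose([[1, 2, 3], [4, 5, 6]]))  # [[1, 4], [2, 5], [3, 6]]
```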

  33. Interesting that your closing comment is that Delphi seems “old”, i.e. you didn’t intuitively/automatically repeat the assertion that it is less productive in comparison. I may be reading too much into that, but I do often wonder how much “productivity booster” gets confused with “sexy, pretty feature”.

    Especially when it comes to C#/VS, as when people try to explain how it’s more productive the meat of their arguments very often doesn’t actually stand up to close scrutiny and what it comes down to is it’s just new and sexy.

    But when we are enjoying something we may *feel* more productive, but are we?

    And even if there is some genuine aid to productivity, how much of that “boost” is offset by other productivity SAPPING differences in C#/VS?

    Aside: In Delphi 2009 you get a similar feature to that which you describe anyway, although I think it is extracting the information from the help system, not source code (so is likely to be more reliable, when source can’t be parsed due to compilation errors, for example, but admittedly isn’t going to be bang up to date).

    This may have been in D2007 too – I haven’t used it in a while.

    But here’s the thing – it’s a perennial complaint that source documentation isn’t kept up-to-date, so how trustworthy is that feature going to be?

    And does it aid productivity to HAVE to complete that documentation even for trivial code that really doesn’t need it? Because I can see that in “formal” development shops that feature will become the subject of a diktat that “all code MUST be fully documented in order to support the IDE inline documentation insight feature” – or whatever it’s called.

    So then people have to spend time – even partly automated as it may be – completing documentation that really isn’t and shouldn’t be necessary.

    My own preference: create code that doesn’t *need* documentation to be consulted in order to be used correctly. 🙂

  34. @Jolyon Smith:

    “Interesting that your closing comment is that Delphi seems ‘old’, i.e. you didn’t intuitively/automatically repeat the assertion that it is less productive in comparison.”

    Oh, Delphi is DEFINITELY less productive than C#.

    I come from Turbo Pascal 3.01 and currently develop privately in Delphi 2007, and have 8 years experience on C# at work, so I hope I speak with some authority.

    When he created C#, Anders took what he’d learned from Java, kept the best of Delphi Pascal and, because he had the luxury of not having to support legacy code, he threw the obsolete Turbo Pascal/Delphi stuff away.

    And by obsolete I mean stuff that was innovative in the day when PCs were slow, RAM was small and the programmer needed to help the compiler along. Turbo Pascal and Delphi were brilliant at this. But they now carry with them structures we have to maintain daily, purely for legacy reasons.

    Whereas C# actively uses the power of modern PCs to help the programmer solve problems. The compiler and IDE is actively assisting, yet keeping out of the way (usually! 🙂 )

    Simple, everyday things that make C# development faster:


    Less typing. Pascal is verbose (it was, after all, only a teaching language). C# has no begin/end, function/procedure etc. OK, we knew that. Readability? It took me 2 weeks to make the change. C# is Delphi with C syntax. I felt the Delphi mindset immediately.


    No more Interface/Implementation:

    C/C++ header files were invented to help compilers/linkers because machines were slow and RAM was limited. As RAM got bigger, putting Interface/Implementation sections in the same file became practical.

    But with the huge RAM and fast modern PCs, C# is able to do away with the Interface/Implementation, .h .cpp (all that stuff) entirely.

    Delphi’s Interface/Implementation is obsolete. But legacy dictates it must stay. Delphi is “old”.


    No more variable declaration section:

    In C# you declare variables as and when needed, even directly in loops.

    This makes for finer scoping and there’s no more scrolling between code and VAR sections.

    As with Interface/Implementation, declaring before using was introduced inter alia to speed up compilation. Modern fast PCs made this obsolete.
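    The scoping point above reads the same in any C-family language; here is a minimal sketch (in Java, which behaves like C# in this respect – the variable names are made up for illustration):

```java
public class ScopeSketch {
    public static void main(String[] args) {
        int total = 0;
        // The loop counter i is declared right where it is used and is
        // scoped to the loop alone -- no separate declaration section.
        for (int i = 1; i <= 3; i++) {
            int squared = i * i;   // also scoped to this iteration only
            total += squared;
        }
        // i and squared no longer exist here; only total survives.
        System.out.println(total);
    }
}
```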



    In C# creating properties is so much simpler. In Delphi I find I think twice before creating properties; again, Interface/Implementation is a factor here (and no, the automation breaks when you decide to rename).


    Automatic Pretty-Printer formatting in C# means I don’t hesitate to refactor. In Delphi, it all too often means manually re-structuring code, slowing me down.


    Help – the main reason I do not dare tell my C# guys to move to Delphi.

    It has nothing to do with keeping Help up to date. I cannot even find normal everyday functions that existed in the Delphi 7 help files! If I didn’t still have my D5 paper manuals and my D7 help files, even MY productivity would be unacceptable. To myself!

    And even if something IS in the Help, the way it just throws EVERYTHING at you, whether Delphi related or not – D7 was brilliant by comparison. What happened?!? Who broke it?


    The summary/param/returns comment tips I mentioned. A HUGE productivity boost, and my biggest “oh, no” experience when I sit down to program in Delphi.

    “In Delphi 2009 you get a similar feature to that which you describe anyway, although I think it is extracting the information from the help system, not source code”

    Possibly I didn’t express myself clearly. USERS can add these comments, to their OWN code, and the compiler, running in the background, makes them IMMEDIATELY available!

    “My own preference: create code that doesn’t *need* documentation to be consulted in order to be used correctly.”

    I would have said that, too. Forget it. Try it. It is fast, does not intrude, you only use it if you want to, but returning to one-year old code and seeing a self-written popup that says:

    Copies ready-made plants from the library
    AsDataID: ID identifying the AS in Prj_AsData and Prj_ASConnections.
    PEMainID: ID identifying the AS in Prj_PE_Main
    procAS: The AS process to which the function chart is to be copied

    is an immediate memory-jogger. And I don’t even have to consult the source code. I just start typing and it comes up on the intellisense list or when I hover over the method in the code.

    Not convinced? All I can say is, try it. Words alone do it no justice.


    Generics: Finally here in D2009, but still flawed I’m told. In C# I was initially sceptical, but when I started to use them I changed my mind immediately. No typecasting errors and I found them more readable (once I got the hang of them). Lists, dictionaries, – just those two have boosted my productivity.
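    For what it’s worth, the cast-free list/dictionary usage being described looks something like this (sketched in Java, whose generics behave much like C#’s here; the names and data are invented for illustration):

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class GenericsSketch {
    public static void main(String[] args) {
        // A typed list: add/get work directly with Strings, no casts.
        List<String> names = new ArrayList<>();
        names.add("Anders");
        names.add("Niklaus");
        String first = names.get(0);            // no (String) cast needed

        // A typed dictionary: keys and values are checked at compile time.
        Map<String, Integer> releaseYears = new HashMap<>();
        releaseYears.put("Delphi 1", 1995);
        releaseYears.put("Delphi 2009", 2008);
        int year = releaseYears.get("Delphi 2009");

        // names.add(42);  // would be a compile-time error, not a runtime crash
        System.out.println(first + " / " + year);
    }
}
```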


    So why do I even still bother with Delphi?

    1. Smaller distributions, free of huge runtimes,

    2. Vastly better low-level control of GUI, memory, well, EVERYTHING really, and

    3. You can mix object oriented and procedural code, whereas in C# you are all too often fighting with static creation order problems. (Believe it or not, the object model is not always the appropriate methodology).

    4. Longer lifetime, more or less immune from the whims of Microsoft.

  35. Oh, and I might add:

    5. A LOT of my own legacy code that I’m loath to redo in another language!

    …so there you are. The classic dilemma.

  36. Well, IDE features won’t matter much to Linux developers, most happily using VI and man pages…
    However, the point was whether multi-platform support is more important than Windows 64-bit support for the next release – and that’s nothing to do with IDE/language features.
    Cross-platform development may bring more new developers to Delphi than 64-bit (which most non-Delphi developers already have…), but IMHO it will take time – a lot.
    First they have to show they did it the right way, but lately Delphi became a little too (in)famous for half-baked solutions – and “Borland non finito” dates back to Midas and Delphi 3. Delphi for PHP was another huge mistake: capitalizing on the Delphi name with a tool that is not up to expectations.
    Developers will be cautious about upgrading and investing too much in a tool that could disappear as Kylix did, especially if a Linux/Mac market for their own applications does not exist and has to be created.
    Also, will the cross-compiler be a standard option, or a separate one, as Kylix was? IMHO having to pay more for a tool, when you have to invest time too to learn to code under another operating system, may hinder adoption.
    The initial lack of the 3rd party libraries that made Delphi so powerful, and that we are accustomed to, will be an issue, and how many will be able or willing to support two/three operating systems is another.
    And being a cross-platform compiler won’t, by itself, lure many Linux/Mac developers – it could make evolution slower.
    IMHO a Delphi-64 compiler will prompt more actual customers to upgrade than xplat, especially if MS starts to accelerate the 64-bit transition by cutting 32-bit versions as they did with 2008 R2 – and they know Linux and MacOS are running fast towards 64-bit computing too.
    It could be a cash infusion, and especially should be used to raise Delphi’s appeal again as one of the best Windows native development tools, ironing out many of the issues still around.
    Being a Jack of all trades and master of none would not help at all.
    But at least they – we – need a clear roadmap, because the zigzag of the past years is no longer tolerable.

  37. @Luigi: Yeah – gone off on something of a tangent here. 🙂


    “Whereas C# actively uses the power of modern PCs to help the programmer solve problems.”

    Here’s your first mistake: the power of modern PCs is best engaged to solve USERS’ problems, and it’s a programmer’s job to create software that does that.

    A user doesn’t care how many whizz-bang gadgets the programmer’s IDE or language has to make their life easier; they want software that works and which works efficiently.

    The problem with C# and .net generally is that the user pays for the programmers convenience.

    But is .net software truly and genuinely produced any more quickly than “native” code?

    Not according to what I hear. Quite the opposite in fact. If anything C#/.net projects seem to slip more than Delphi ones, not least because they run into problems caused by the shifting sands of the .net runtime and framework.

    “Less typing. Pascal is verbose (it was, after all, only a teaching language). C# has no begin/end, function/procedure etc., OK, we knew that.”

    Less verbose = less clear.

    Why aren’t novels written using txt mnemonics? It would save a fortune in printing costs and novels would be quicker to write because there would be less typing.

    No one would want to read them of course.

    “C# is Delphi with C syntax.”

    That’s just ridiculous. C# may be Java with C syntax but it is nothing like Delphi. Just because it came from the same “mind”. But even the best minds don’t always remain sharp.

    “No more Interface/Implementation”

    This is a BAD thing, NOT a good thing.

    In C# you have to wade through pages of declarations intermingled with implementations to get any idea of the shape of a class in the source. Sure the IDE can give you a neat summary, assuming your source is “parse complete”.

    I know C# developers who love C# but really MISS the clean lines of the interface/implementation separation.

    “No more variable declaration section”

    Same applies as interface/implementation. Having to go searching through implementation code to find variable declarations is NOT a productivity boost in the long run.

    These two things, and many others, may make it possible to CREATE code more quickly (i.e. conveniently/lazily), but code spends vastly more of its life being maintained than being created, and any measure of productivity has to give far greater weight to cost of maintenance than cost of creation (unless we’re talking about utterly throw-away, disposable code).

    “Automatic Pretty-Printer”

    That’s an IDE facility, nothing to do with the language and nothing to do with productivity. Project Weaver will deliver this to the RAD Studio IDE if you really find yourself needing such things.

    And I challenge anyone to prove that refactoring is a productivity boost.

    IME, it’s anything but.

    With refactoring tools readily to hand, less thought goes into initial designs of code, because the design can be improved later with refactoring tools if needs be. So people spend time refactoring.

    When doing so, they feel productive because the tools are working for them, but they aren’t stopping to measure the time they are spending refactoring compared with the time they might have saved by thinking properly about the initial design in the first place.

    Getting it right first time is much quicker and cheaper in the long run than fixing it up later.

    This is something I see time and again with newer developers. They throw out code in a hurry with the intention to refactor it into shape later.

    Too often that last part never happens, resulting in really poor quality code that then has to be maintained. But even if the refactoring does occur they really don’t seem to see that they are working ineffectively. They don’t FEEL that they are because they have these great tools that make them feel productive, even though those tools are actively encouraging them to work UNproductively.

    “USERS can add these comments, to their OWN code, and the compiler, running in the background, makes them IMMEDIATELY available!”

    No, you did, and I did point out that in Delphi 2009 the info comes from the help system, so it does not “help” with your own code (unless you also create help that can be installed into the IDE, presumably).

    I also made the point that DEVELOPERS (not USERS – incidentally) are notorious for not keeping such documentation up to date. And a developer that is not inclined to take the time to create self documenting code is not really likely to want to be bothered creating ADDITIONAL documentation either, let alone maintain it.

    Your example was interesting….

    “Copies ready-made plants from the library
    AsDataID: ID identifying the AS in Prj_AsData and Prj_ASConnections.
    PEMainID: ID identifying the AS in Prj_PE_Main
    procAS: The AS process to which the function chart is to be copied”

    I look at that and immediately think that you only need the “documentation” because the parameters are not adequately named.

    e.g. rather than “procAS”, why not “aDestProcess” ?

    Without knowing what “AS” and “PE” are I can’t suggest specific renaming for those params, but I’m sure you could name the parameters in a far more helpful manner than is the case here.

    And I can only assume that the method in question is NOT called:

    “CopyLibraryPlant( .. )”

    because otherwise the description of the method is itself utterly superfluous, so I’m guessing the routine is called “Copy” or something equally ambiguous or unclear.

    Generics solving typecasting errors… perhaps we approach our development work differently.

    I’ve never suffered the type casting problems that some people find generics to be some great salvation for.

    Type safe containers are free to anyone prepared to create them, even without generics. Generics on the other hand come with very real costs.
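    A minimal sketch of the kind of hand-rolled type-safe container being alluded to (in Java for illustration – a Delphi version would wrap TList the same way; the class and names here are hypothetical). The raw storage stays private, so the single downcast lives in one audited place:

```java
import java.util.ArrayList;

// A type-safe container built without generics: callers can only insert
// Strings and only ever get Strings back, even though the backing list
// is untyped.
public class StringList {
    private final ArrayList items = new ArrayList();   // raw, untyped storage

    public void add(String s) { items.add(s); }        // only Strings get in

    public String get(int index) {
        return (String) items.get(index);              // the one cast, centralised
    }

    public int count() { return items.size(); }

    public static void main(String[] args) {
        StringList list = new StringList();
        list.add("interface");
        list.add("implementation");
        System.out.println(list.get(0) + "/" + list.count());
    }
}
```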

  38. @Luigi: Yes! But the lagging behind of Delphi and esp. the IDE compared to C#/VS is something that worries me, and I am privileged to have this opportunity to bounce some thoughts off Jolyon. I’m assuming Jolyon is giving tacit permission to briefly go off-topic on his blog?


    “Here’s your first mistake: the power of modern PCs is best engaged to solve USERS’ problems, and it’s a programmer’s job to create software that does that.”

    I agree 100%. But I, too, am a user.

    My customers use my products to solve their problems quickly and efficiently. They expect me to leverage the power of modern PCs to help them do that. Else they’d be using a text interface on DOS. Or pencil and paper.

    And by the same token I as a customer of CodeGear expect their products to do the same for me. Else I’d be using a text editor and command line compiler.

    Delphi has been standing still for too long. Time to catch up.


    “The problem with C# and .net generally is that the user pays for the programmers convenience…If anything C#/.net projects seem to slip more than Delphi ones”

    The VCL, it can be argued, is also there for the programmer’s convenience. You can reduce exe size significantly by using KOL or similar 3rd party VCL replacements, or even further by addressing the Windows API directly.

    On the other hand, why not have the best of both worlds: program with Delphi.Net and inconvenience the user, too! 🙂 (joke…)

    But to make clear: I am not promoting the .NET framework. I could dig up many examples of where the VCL is more mature than the .NET framework.

    I am strictly comparing C# and Delphi as languages, and VS and the Delphi IDE as tools that help the developer.


    “Less verbose = less clear.
    Why aren’t novels written using txt mnemonics?”

    Because a novel is read as a pastime; by and for people that understand the language it is written in.

    Mathematics is written in a terse symbolic style, for mathematicians who know what those symbols represent.

    A program describes an algorithm; for people (and compilers) that understand the language it is written in.


    “”C# is Delphi with C syntax.”
    That’s just ridiculous. C# may be Java with C syntax but it is nothing like Delphi.”

    I must disagree. Consider:

    * int counter; vs var counter: Integer;
    * returnType funcName(type param); vs function funcName(param: type): returnType;
    * { and } vs begin/end,
    * == vs =, and = vs :=,
    * while, for, do/until-style loops,
    * conditionals, branching,
    * …I could go on.

    “using” instead of “uses”… and C# develops the concept further as Namespaces.

    Classes, Properties, Inheritance, … all the same conceptually, just written differently.

    Now think back to Asm, Cobol, Prolog, Lisp, classic Basic, PL1, Fortran.
    Barely any similarities between them.


    “No more Interface/Implementation…

    This is a BAD thing, NOT a good thing.

    In C# you have to wade through pages of declarations intermingled with implementations to get any idea of the shape of a class in the source.”

    True, if you are reading it with a text editor.

    “Sure the IDE can give you a neat summary, assuming your source is “parse complete”.”

    Correct. It also does navigation for you. And I find I am “parse complete” more easily in C# than in Delphi, in part because I don’t need to keep Interface/Implementation in sync.


    “I know C# developers who love C# but really MISS the clean lines of the interface/implementation separation.”

    Am I correct in assuming from this that you have not yet programmed in VS/C#?

    If so, give it a go. Seeing how some things can be done differently (and sometimes better) has taught me a lot.


    “”Automatic Pretty-Printer”

    That’s an IDE facility, nothing to do with the language and nothing to do with productivity. Project Weaver will deliver this to the RAD Studio IDE”

    Ah! OK, thanks.


    “And I challenge anyone to prove that refactoring is a productivity boost.

    Getting it right first time is much quicker and cheaper in the long run than fixing it up later.”

    “…code spends vastly more of it’s life being maintained than being created, and any measure of productivity has to give far greater weight to cost of maintenance than cost of creation”

    That is what I mean.

    Just to clarify: I was talking about large-scale manual refactoring. Automatic refactoring is useful only in specific areas.


    “Your example was interesting”

    My example was edited together to show the summary popup concept, no more.


    “..a developer that is not inclined to take the time to create self documenting code is not really likely to want to be bothered creating ADDITIONAL documentation either, let alone maintain it.”

    This precise problem is costing us dear. Identifier names that were totally obvious to the original developers are proving less so to the new team taking it over.

    But even so, there are special situations where additional documentation is appropriate, and I expect my programmers to be aware and comment them.

    If you have ever done this in Delphi: (again, just an edited-together sample)

    procedure myProc(myParam: Integer);
    // Do not use before welding arm park contact verified.
    // myParam may not be a negative number

    you can also do this (and with less effort, due to automation assistance):

    /// <summary>Do not use before welding arm park contact verified</summary>
    /// <param name="myParam">May not be a negative number</param>
    void myProc(int myParam)

    …and when the mouse hovers on an instance of myProc the full description pops up.
    And when coding, the parameter, with the caveat, shows up in the intellisense.

    I know I am having difficulty bringing over how useful this is. I guess you have to have used it once or twice to appreciate it.


    “Type safe containers are free to anyone prepared to create them, even without generics.”

    This sounds interesting – I am aware I can learn from you.

    “Generics on the other hand come with very real costs.”

    Agreed. But every higher level language feature adds costs, else we’d all be coding in ASM/machine code. It is the responsibility of the programmer to be aware of the costs and make a judgment.


    Well, I think I’ve overstayed my welcome!

    This was just something I needed to get off my chest.

    Thank you for listening.

  39. Before anyone jumps to conclusions, we have to look at it from Codegear’s POV. We don’t know, and probably will never know, anything about the data they gathered that has led them to the conclusion that xplat is “more important” than D64.

    I believe that if they were going to do xplat, it would ALSO have to be 64-bit, so parallel development is definitely what *should* be going on.

    What took them so long to get Unicode in has confused many long time developers. Now that it’s here, it has fractured the community into the Unicodes and ANSIs where the ANSIs don’t want to upgrade because of the work involved.

    Now, if the same kind of thing happens with 64bit? They will lose even more upgrade customers. I know that my current employer has absolutely NO reason for Unicode NOR 64bit, so if he wants to upgrade to the latest Delphi (after 64bit release), it would not be very good ROI for him to upgrade.

    The software we develop is for the US construction industry and is not and never will be sold overseas. We do not see a benefit to us from Unicode and most likely not from 64-bit, unless Microsoft forces us to. We also will not be exploring xplat.

  40. @EShipman:

    There is a key difference between the approach taken with Unicode and 64-bit.

    With Unicode you get no choice – your app is Unicode unless you go to some length to “ANSIfy” it by modifying your code.

    With 64-bit it will be a simple build target choice. When you compile you will be able to choose 32-bit or 64-bit targets.

    The problem of course is that if you need 64-bit but DON’T need Unicode, well, the only way to get to the 64-bit capability will be to suffer/embrace/overcome the transition to Unicode; along with your users who will “enjoy” a performance hit and memory footprint expansion of some degree and possible disruption to production database systems for ZERO gain (for them – if Unicode was important to them after all, they wouldn’t be current users of a non-Unicode application, would they?).

    This is why 64-bit should have come not just before xplat, but before Unicode, or at least as part of the Unicode exercise (as was suggested by Danny Thorpe a LONG time ago, iirc).


    A welcome can never be over-stayed on this site. 🙂

    But I won’t respond to your specific points in detail this time around – I think we’ve reached the point where we must agree to disagree. 🙂

    To answer just one question though….

    No, I haven’t used C# in earnest because on those occasions when I have been compelled or required to use it I found it so utterly distasteful that I had no inclination to return to it voluntarily.

    But that started right from the very beginning. When C#/.NET first arrived I was excited by it and very interested. To the extent that I started putting together some courseware with the aim of offering training for Delphi developers looking to make the transition to C# and .NET.

    In the process of putting that material together I naturally had to pull back a lot of the covers on C# and .NET and the more I did, the less I liked what I saw, and in the end I dumped the whole idea – if I didn’t believe in changing from Delphi to C# I couldn’t very well stand in front of a room of people and tell them they should.

    The benefits that you claim for it are completely at odds with my own philosophy of software development evolved over 20 years of Windows development using a variety of languages (Java, C++, Pascal, and 4GL’s).

    C# and .NET, along with many other current fads in software development, aim to make a developer’s job so easy that they do not have to think, and in my long experience, the less a developer thinks the more inclined they are to make mistakes.

    Perhaps it’s the difference between a jobbing handyman and a craftsman.

    The handyman wants to get in, get the job done and get out, and isn’t too concerned about the quality of the work he leaves behind, as long as he can get the job done quickly and it’s good enough to secure payment from his victi… customer.

    The craftsman takes pride in his work and has honed his craft over the years and not only enjoys but values the practice of applying that craft, recognising that there is always more to be learned even from doing the things that the handyman finds tedious and time consuming.

    Which is not to say that I place myself as a craftsman above and looking down on the handymen, just illustrating the philosophical difference.

    In practice I don’t think any one developer is a 100% handyman or 100% craftsman – these are simply hyperbolic examples of the two different approaches that I think may be at play.

  41. @Ken:

    The Delphi IDE has supported XMLDoc comments, and displaying them as popup tooltips for the last couple of versions. It doesn’t always get it right 🙁 , but it’s there, and hopefully getting better with time…

  42. So you think perhaps the problem is that people are comparing the latest greatest VS with an old(er) version of Delphi, and instead of suggesting that we all should try the latest VS IDE they themselves should perhaps try the latest Delphi IDE, in order to make a true and fair comparison ?

    That’s just crazy talk! 🙂

  43. Well, while we’re talking, time is passing and still no 64 bit, or clarification from Embarcadero. I did get a “final” marketing email the other day offering the chance of a free upgrade to RAD studio 2009 if I bought D2009. I suspect that means they are starting to clear the decks ready for the “Weaver” release.

    Anyone got any thoughts specifically on whether we are going to see the “preview” 64 bit compiler as part of “Weaver”? The preview was “promised” for the middle of 2009 and they’ve been very quiet on that front.

  44. @Alistair: XMLDoc – that sounds like it could be it, thanks. I will look it up. (Never saw it documented, tho’!)

    @Jolyon: I use Delphi2007 privately and we use C#/VS2005 at work, but the VS2003 version already supported what I described above.

    @Paul: My theory is Embarcadero (thinks it) needs xplat for its business strategy. And since Embarcadero uses Delphi for its own products, CodeGear has some priority to fulfill its parent company’s requirements. Embarcadero’s continued success is a pre-requisite for CodeGear’s future existence.

    Altho’ I would have thought databases are one of the first product categories that would profit from a 64bit memory space.

  45. @Jolyon: Oops! I missed your previous posting!


    “A welcome can never be over-stayed on this site.”

    That’s a relief! Thank you. I feared becoming a nuisance.

    “we’ve reached the point where we must agree to disagree.”



    “…C# and .NET and the more I did, the less I liked what I saw…”

    .NET, yes, agree.

    The .NET runtimes thread themselves so deeply into the system, it’s a cheek. And every version requires installing a new one, and they just get bigger and bigger! And each one repeats most of the same functionality.

    After 8 years programming C#/.NET, I can tell you the framework is not as mature as the VCL.


    “…Java, C++, Pascal, and 4GL’s…”


    In the beginning, one had to program very near to the processor (my first programming experience was toggling instructions into the memory of an HP2100A). Even data I/O was done by specialists.

    On the other extreme, the 4GL dream was you just tell the computer what you want done, and it works out the details itself. An illusion, as it turned out.

    However, the reason we use higher-level languages is to find a middle way, whereby we programmers can focus on the problem domain down to every last detail, but can still address the needs of the machine where necessary. Ideally without it getting in the way of our train of thought.

    As machines got more powerful, users have benefitted. But we programmers have also stolen some of that power to make things easier for ourselves. We no longer need to define which sectors on the hard drive our program uses. On our Siemens mainframes, this was the program’s responsibility. Now it’s all dealt with by the OS, which takes clock cycles and RAM from the user, but today there is now enough for this task.

    Even so, sometimes a PC will hesitate when the user saves or deletes perhaps because caches are being compacted or flushed in the background. Slow as those old Siemens mainframes were, they owned their clusters and never hesitated.

    There is a cost/benefit choice to make.

    The same applies to the tools we use. I recall how amazed I was the first time I could compile; and the error file would appear automatically in a separate window without me having to manually open it every time. And when I scrolled down the list of errors, the editor window would jump to the line number in the code! Revolutionary! But it was taking up precious RAM. There was a cost/benefit balance.

    By comparison, the Delphi IDE is a hog. And compared to the PL/M libs I used back then, so is the VCL. But modern PCs take that in their stride, and if we decide the resultant rapid development benefits both us and our customer, we make that sacrifice.


    “C# and .NET, along with many other current fads in software development, aim to make a developer’s job so easy that they do not have to think, and in my long experience, the less a developer thinks the more inclined they are to make mistakes.”

    Oh, I can identify with that! Turbo Pascal 3.01 made me lazy. And careless. Just like people that stand on escalators, I was taking the time saving Turbo Pascal had given me and squandering it by programming carelessly and just hitting Run. Bad.

    But the fault was mine, not Turbo Pascal’s. I took a while to pull myself together.


    I will sum up my argument in two aspects:

    One, the increasing power of PCs can improve development, with or without putting demands on the customer’s machine.

    So for example, Search/Replace functionality (trivial example) speeds development but the product the customer gets does not suffer by our decision to use an IDE with such functionality.

    But the decision to target .NET or the VCL runtimes or even WinAPI directly DOES impact the customer.

    So there is a cost/benefit balance. Targeting the WinAPI directly may make for small and blindingly fast code, but if it never gets finished, nobody benefits. Whereas the clunk that is .NET is too far the other way. And that leads me on to…

    Two, the balance point driving this decision is largely determined by the current state of PC technology. And here we almost certainly will disagree, but I posit that Delphi the language and Delphi the IDE are now lagging behind the cusp of that balance point.

    I blame Borland’s internal conflicts of interest and I fear I see the same situation developing at Embarcadero. But Delphi the language and the IDE development is finally on the move again, so I’m looking on the bright side.


    “Perhaps it’s the difference between a jobbing handyman and a craftsman.”

    And programmers new to the field. Concatenating strings. Do they understand what is happening there? And the latest LINQ technology. How does this impact SQL optimisation, something that is often database vendor dependent?

    Learning to use these new tools is one thing. Understanding the cost/benefit balance of what is happening under the hood is another.

    As Joel Spolsky points out, abstractions are leaky. And whilst layers of abstraction may make programming superficially easy, even experienced programmers may struggle to find out what the heck is going on.

    Right: if I didn’t overstay my welcome previously, I most certainly will have done so now!

    Thanks for your blog and have a good day.

  46. Unicode was most important; now it’s 64-bit. Cross-platform is second. I totally agree. How long I’m going to wait is not known…

  47. We can argue over whether 64-bit is that important or not, but one thing I think all Delphi developers can agree on is that Embarcadero is really bad at giving information about their plans. I understand that plans and roadmaps can change (so they don’t want to announce anything too early), but I hate that we don’t get any information about Delphi’s future from the direct source and have to speculate based on the bits we get from interviews, blog comments and conference appearances.

Comments are closed.