Friday, November 23, 2007

Which is better - cake or CAKE?

In all intellectual debates, both sides tend to be correct in what they affirm, and wrong in what they deny. - John Stuart Mill


Capital letters don't make a difference to a cake. The taste is what matters, isn't it?

I was not aware of the debates going around in the industry about Ruby, which are essentially about comparing Ruby with other programming languages. Before proceeding further, I would like to insist that I am passionate about C# and deeply invested in the language's capabilities. I write this article because my previous post ended up surrounded by debates full of (at least to some extent) invalid questions like the one above.

Here is the familiar Q&A model of writing an article, which tends to pave the way for debate, but I would like this to stay a healthy discussion without attacking any individuals.


1. Should we compare Programming Languages?

Not every question needs to end in a single Yes or No answer.

Not surprisingly, developers who are deeply involved in technology never hesitate to compare languages. But developers who have worked as end users of one specific compiler for years tend to stick to that particular language, never step out of it, and start advocating on its behalf (a kind of addiction and possessiveness). The latter's stand is always that their pet language is the best in the world and that its creator is their God.

There is no point in sticking to any one language. Take the case of Microsoft. They started with BASIC and added 'Visual' to its name. Later they jumped into the so-called powerful VC++, and then, out of frustration, created their own language, C#, again with the same 'Visual' prefix. If nobody in Microsoft had compared programming languages, would Visual Studio be in the market? Would C#? Please don't start any out-of-scope debates here about whether I am a proponent of Microsoft, or whether C# is better than Java/C++, and so on with endless concerns; that is not the intention of this post. Those topics are already under hot debate.


2. Can we compare an Interpreted Language with a Compiled Language?

This is a question I faced indirectly about the previous post on my blog. The question I actually got was, "Hey joker!! How dumb are you to compare C++ with JUST a scripting language, Ruby!!" I felt sorry for that friend. I have rephrased it into the question above.

In this ever-developing world of OSS, the line between a compiled language and an interpreted language is blurring. Hot under the collar? Don't reach for academic textbooks to see the difference between the two. Rather, look around at what the industry is actually doing.

VB was an interpreted language, and now it gets compiled into IL (Intermediate Language) in .NET. (FYI, C# and Java are both semi-compiled, i.e., semi-interpreted, languages.) This raises the question, "What do you mean by a compiler?" Go back to the books. It does not only mean the conversion of a high-level language into machine-understandable code. It can also mean the conversion of a very high-level language into an intermediate-level language. If you cannot accept this point, then you should not be working in .NET or Java!!
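To make that concrete, here is a minimal sketch, using Python purely as a stand-in for any "interpreted" language: the interpreter itself compiles source into an intermediate bytecode before executing it, and the standard library's dis module will show you that bytecode.

    import dis

    def add(a, b):
        # A trivial function; the interpreter has already compiled it to bytecode.
        return a + b

    # dis.dis() prints that intermediate representation,
    # conceptually similar to IL in .NET or bytecode on the JVM.
    dis.dis(add)

So even a "scripting" language has a compilation step; it just happens behind your back, into an intermediate form rather than into machine code.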


Here is another crazy scenario for developers who swear by compiled languages. Are you aware that a C/C++ "interpreter" exists? Go through reference [4]. Fine. And are you aware of an emerging Ruby "compiler"? Go through reference [5]. That is why I suggest (not insist) that we stop labelling any language as interpreted or compiled. Each deserves the general label "programming language": a means of communicating with the underlying machine.


There are drawbacks to any interpreted language. An important one is that interpreted languages are slower than compiled languages. The reason is that the code is parsed and syntax-checked at run time, dynamically linked, and converted to native code or an intermediate form, all while the program is running. Whuf!! That really takes time.
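As a rough illustration (again in Python, used only as a stand-in; the exact numbers will vary by machine), here is a sketch that times the cost of re-compiling a snippet at run time versus executing an already-compiled code object. It is not a cross-language benchmark, just a way to see that parsing and compiling at run time are not free.

    import timeit

    src = "total = sum(i * i for i in range(1000))"
    precompiled = compile(src, "<string>", "exec")  # the work a compiler does ahead of time

    # Re-parsing and re-compiling the source on every run: interpreter-style overhead.
    with_compile = timeit.timeit(lambda: exec(compile(src, "<string>", "exec"), {}), number=10000)

    # Executing the already-compiled code object.
    without_compile = timeit.timeit(lambda: exec(precompiled, {}), number=10000)

    print("compile + run each time:", round(with_compile, 3), "seconds")
    print("run pre-compiled code:  ", round(without_compile, 3), "seconds")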

The pros of the same family mostly revolve around easing the developer's job. What you see in the code is what you get. Such languages are easy to read and elegant to read (that is the only point I advocated in my previous post).

The pros and cons don't end here; there are many more. But the general opinion is that interpreted languages eat CPU cycles to make the developer's life simpler.

At this point I would like to throw light on two comparative studies, each well biased toward its own side, [1] and [2]. Again, please don't debate about Mr. Naidu or Mr. Vincent here :) Take their words for what they are, and leave comments about them and their articles on their own pages.


3. So, interpreted languages should not be used in projects targeting scalability?

If you have interpreted it that way, it shows that you are stuck with a compiler mindset. Spending a little more on hardware would fix it. As I already said, many (though not all) scripting languages are easy to read and hence easily maintainable (even if the learning curve is not always pleasant). This leads to an increase in developer productivity.

In simple cases, from a CFO's perspective, increasing the developer head-count is not better than adding a couple of servers. Spending $40,000 on 2 developers and $100 on a server is not better than spending $25,000 on one developer and $300 on 3 servers; and it is kind of a win-win between the CFO and the fellow developer!! Again, please don't debate the CFO's profit versus the developer's profit; that is not the context. ThoughtWorks is an example of the above scenario. They are pioneers in dramatically increasing developer productivity, and they are involved in many Ruby projects. (FYI, not JUST Ruby projects, but many others.)
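To make that back-of-the-envelope arithmetic explicit, here is a tiny sketch using the illustrative dollar figures from the paragraph above (they are hypothetical examples, not real quotes):

    # Back-of-the-envelope figures from the paragraph above (illustrative only).
    devs_heavy = 2 * 20000     # $40,000 for two developers
    servers_light = 1 * 100    # $100 for one server

    devs_light = 1 * 25000     # $25,000 for one developer
    servers_heavy = 3 * 100    # $300 for three servers

    more_developers = devs_heavy + servers_light   # $40,100
    more_servers = devs_light + servers_heavy      # $25,300

    print("Hire more developers:", more_developers)
    print("Buy more servers:    ", more_servers)
    print("Savings:             ", more_developers - more_servers)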


4. This could be a valid point for projects running on big (comparatively ;) servers. So what happens in the case of small hardware devices? Can we use scripting/interpreted languages there?

There is a general misconception that interpreted languages cannot be used on hardware at all. The popularity of a language has nothing to do with its ability. If C or C++ is used on most hardware, that does not mean they are the only languages fit for hardware. If you are looking for pucca (rock-solid) performance on hardware, then you should really end up writing machine language, or in the worst case raw 0000001010101010... :) and not C/C++.

I am not experienced in hardware programming, but I have at least gone through some articles to justify my stand. Interpreted languages can be used on hardware. Then what about the CPU cycles eaten by those villainous interpreters? The world is moving towards compressing gigabytes into millimetres. Hardware is getting better at withstanding any load, and I am sorry for those who work on legacy hardware (you have the privilege of sticking to Bjarne Stroustrup). I am speaking here about the present and the future, not history. And as far as I know, there is hardware that runs Ruby.

And again from a CFO's perspective: if you need 10 promising, experienced developers to build "scalable" hardware using C/C++ or any compiled language, you would need only 2 hardware architects and 3 software developers (or even 5 or 6, which is still easy on the pockets) to do it in ever-pleasing and easily maintainable interpreted languages.

I stress that I am not even a novice hardware developer, but the above points can be validated against whatever resources are available.

To keep the other side happy, here is an excerpt from reference [3]:


Java runs at 1.8 times the speed of compiled C; Lua (using a JIT compiler), at 3 times; Python, at 6.7 times; PHP, at 7 times; Perl, at 9.8 times; and Ruby, at 16 times. So, where performance is critical, Java or a compiled language will fare far better than any dynamic language.


And I believe that CFOs and their companies' clients will never sacrifice feasibility in terms of money for the sake of feasibility in terms of programming languages.


5. So, what is your conclusion?

Conclusions are up to you, but here is mine:

It depends on the situation. If you are targeting developer productivity, then many (though not all) scripting languages are there to help you. If you are targeting a legacy hardware system and the large, readily available developer community of compiled languages (or an HR department too lazy to find a few scripting-language developers), then go with the compilers. This is my opinion.

A word of caution: don't blindly believe anyone's suggestions (here or in the comments that may follow). Research for yourself and arrive at the truth, not just the facts!!

Here are the references I used as groundwork for writing this post:
1. http://sapnaidu.net/blog/?p=67
2. http://www.artima.com/forums/flat.jsp?forum=123....
3. http://www.infoworld.com/infoworld/...
4. http://www.ddj.com/cpp/184402054
5. http://www.eweek.com/article2/0,1759,1996960,00.asp
6. http://www.devx.com/RubySpecialReport/Article/34497
7. http://www.reybango.com/index.cfm/2007/...
8. http://www.radicalbehavior.com/5-question-...
9. http://rubyhacker.com/ruby37.html
10. http://www.activestate.com/Products/komodo_ide/?_x=1

Here are some notable excerpts from these links; the last one is the most important:
Ruby is 16 times slower than the JVM.


Ruby is slow.


Ruby is notoriously slow, but we have lots of ideas for speeding it up.


I would say Ruby is Relatively Slow. Ruby does offer a significant amount of power and dynamicity. These Core and Much Beloved features of the language and the Rails framework contribute to its Relative Slowness.

Fact is, when you make the machine do it, instead of the programmer, there is some expense to pay. These arguments are the same arguments used for and against ColdFusion.

Sure, the Twitter people could have implemented the whole site in Hardware, if they wanted pure On Metal speed. They chose to use technology that got them off the ground much faster than a Hardware/ Assembly/ C/ C++/ etc based platform would have gotten them.

You don't get both sides. Reference: [7]

