
Good article, but I have to disagree with you regarding the C# ecosystem. Sure, it's mainly Windows, but there are boatloads of tools and support out there for it. Visual Studio is a great IDE, and the documentation for C# and .NET is generally good.

I'm mainly a C# developer, and recently did a bit of Java development in Eclipse (Android ADT). I missed some features of C#, but the languages are still very, very similar.



There's definitely a lot of C# code out there. I do think that open source C# code is more limited than in some of the other popular languages. However, the .NET standard library does so much that you often don't need much more than perhaps a custom protocol implementation or some UI code.

I think Windows is the major drawback to C#. No one with half a brain is going to write server software that only works on Windows. Desktop software? Sure... I wouldn't write desktop software in anything else (at least for mass market software). I admit to not having worked with mono, and I think it is kind of a catch-22 - if it's not a common framework, no one will write for it, and if no one will write for it, the framework won't get much love... which is a shame, because Java obviously is in dire need of some competition to spur innovation over there at Oracle. Using Java feels like going back in time to 2002.


Although the core libraries in .NET are huge, many are also very shoddy. There are a few great APIs, but the overall quality and usability is pretty uneven. This is mostly a problem with APIs dating back to version 1.0 - unfortunately, those APIs often haven't been updated. The documentation is also uneven; MSDN has lots of pages that describe a method Bar.Foo() as "Fooing a Bar", which is next to useless. I get the impression the Java APIs are often better documented.

NuGet might really turn things around; here's to hoping...


Yeah, the worst ones are the stream ones, which is somewhat ironic given that he singles one out over Java. The IO one may have `ReadToEnd`, but it's still nowhere near as simple as Python or Ruby; it just shouldn't take that many lines of code to read a file.
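For reference, the stream-oriented idiom being complained about looks something like this (a minimal sketch; the temp-file setup is only there so the example is self-contained):

```csharp
using System;
using System.IO;

class ReadFileExample
{
    static void Main()
    {
        // Create a demo file to read (path and contents are made up).
        string path = Path.Combine(Path.GetTempPath(), "hn_demo.txt");
        File.WriteAllText(path, "hello");

        // The verbose way: open a stream, wrap it in a reader,
        // read, and dispose both.
        using (FileStream fs = new FileStream(path, FileMode.Open))
        using (StreamReader reader = new StreamReader(fs))
        {
            Console.WriteLine(reader.ReadToEnd()); // prints "hello"
        }
    }
}
```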

And these days I include RestSharp pretty much straight away if I'm going to be doing any HTTP calls; the native HttpClient API is horrible. But even RestSharp doesn't have the greatest API: there's absolutely no good reason to have to declare a `RestClient` before doing a `RestRequest`, it's just a waste of lines of code. You always end up writing a wrapper around it.
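For illustration, the two-step pattern being criticized looks roughly like this, a sketch against RestSharp's classic API (the third-party `RestSharp` package must be installed, and the URL and resource path are placeholders):

```csharp
using System;
using RestSharp; // third-party package, not part of .NET

class RestSharpExample
{
    static void Main()
    {
        // Step 1: a client bound to a base URL...
        var client = new RestClient("http://example.com/api");

        // Step 2: ...then a separate request object for the resource,
        // even when you only ever make one call.
        var request = new RestRequest("widgets/42", Method.GET);

        var response = client.Execute(request);
        Console.WriteLine(response.Content);
    }
}
```

The usual wrapper mentioned above just collapses those two declarations into a single method call.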


There are ways in the .NET framework to read an entire file with one line: `File.ReadAllText("file.txt")`. And the `System.Net.WebClient` class provides ways to make simple HTTP calls in far fewer lines than `WebRequest` does.
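A quick sketch of both shortcuts together (the file path is made up, and `DownloadString` is fed a `file://` URI here purely so the example runs offline; against a real server you'd pass an `http://` URL):

```csharp
using System;
using System.IO;
using System.Net;

class OneLiners
{
    static void Main()
    {
        // Set up a demo file to read.
        string path = Path.Combine(Path.GetTempPath(), "demo.txt");
        File.WriteAllText(path, "hello from a file");

        // One line to slurp the whole file:
        string text = File.ReadAllText(path);
        Console.WriteLine(text);

        // WebClient makes fetches similarly short; it also accepts
        // file:// URIs, which keeps this example self-contained.
        using (var client = new WebClient())
        {
            string fetched = client.DownloadString(new Uri(path).AbsoluteUri);
            Console.WriteLine(fetched);
        }
    }
}
```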

Likewise, Python has ways for interacting directly with streams.


WebClient's API is essentially buggy; it can't handle encoding properly, since the API's order of operations differs from HTTP's, and the response headers aren't exposed in a way that lets you work around the limitation.
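One reading of that complaint, as a sketch (the URL is a placeholder): `WebClient.Encoding` must be set before the request is made, but in HTTP the real charset only arrives in the response's Content-Type header, after the body has already been decoded.

```csharp
using System;
using System.Net;
using System.Text;

class EncodingProblem
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // You have to guess the encoding up front...
            client.Encoding = Encoding.UTF8;

            // ...but the server only declares the actual charset in the
            // Content-Type response header, so by the time you could
            // check it, the string has already been decoded.
            string body = client.DownloadString("http://example.com/");

            // ResponseHeaders is only populated after the call returns,
            // too late to influence the decoding above.
            Console.WriteLine(client.ResponseHeaders["Content-Type"]);
        }
    }
}
```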

It's fine for a quick hack, it's useless for a reliable web client.


Why not write server code for Windows? I fail to see a major reason that would stop one from doing that. Actually, it's a pretty good, well-supported, well-documented and stable platform.


It means that hosting your code goes from a simple equation about what hardware will cost you (actual servers, virtual servers or other solutions) to something that also involves software licensing just to have your code run.

It means that if you ever need to scale up, you'll have to buy more licenses. It prohibits unlimited, automatic scaling.

There's also the embarrassing once-a-month ordeal of sites being down due to server reboots after another round of Windows Update.

I deploy lots of stuff to Windows servers, but I can see why some people would avoid it if they can.


Every time I see someone talking about "unlimited" something, my spider sense ticks off. Except for Google, Amazon and some other big players, most companies will worry about lots of other things before what you mentioned becomes a problem.

Also, AWS, Azure and AppHarbor have taken the license problem out of the way.


> Every time I see someone talking about "unlimited" something, my spider sense ticks off.

Sure. For some values of unlimited it's a purely academic question. But if you can scale automatically to handle double the load (which is a more realistic concern), chances are you can scale further without any other changes.

And yes, AWS, Azure and AppHarbor all handle some of the licensing issues for you, but that is also clearly reflected in the costs.

Again, I'm not saying that nobody should ever use a Windows server for anything. I use it myself. I'm just saying that I can see several reasons why people would choose to avoid it.


AWS doesn't really take the license problem away -- it just rolls the cost into the price of the instance, which is significantly higher for Windows.


And it will only become a problem when you reach a huge scale. If by that time it is still a problem, then maybe your real problem is not having a good enough business model.


We're seed-funded and have somewhere around 70 instances in AWS, almost all Linux. In our business, our margin depends largely on our operational costs, and using Windows would be a significant hit to our bottom line.


So if you had to double your servers (say, to 140), would you be at a loss?

I hope that this is just the initial phase of the company (although not too initial, since you need 70 instances) and that you will get to a better margin along the way. Otherwise, ANY bump you hit on the road ahead will break you.

Competing in a market that relies entirely on price is a death sentence, unless of course that is your plan.


This. If you're just running a single server inside your enterprise firewall that only needs to scale as the company hires new employees, it's probably fine. However, if you need something that can grow with customer demand (which usually means 3-5 orders of magnitude more growth than you'd see from your employee base), then the licensing and costs become prohibitive.


Actually, it means that you may need to consider a solution like Azure if you have a public-facing site or app and want an MS solution.

There are other ways of patching servers besides all-at-once...


But what you're not understanding is that there are costs to the alternatives. There are costs to deploying on Linux. There are costs to building software in Java as opposed to C#. There are costs to hiring Linux sysadmins as opposed to Windows sysadmins.

If you don't look at the big-picture costs, then you lose out on a lot of options.


"No one with half a brain"

This does not contribute to the discussion.


You're right. It was poorly thought out and unnecessarily inflammatory. My apologies.


> No one with half a brain is going to write server software that only works on Windows.

Once you step out of your little bubble, you'll realize what a stupid comment that is.


I guess I should have qualified my statement better. When I said server software, I meant servers hosting cloud services that need to be easily deployed and scaled with demand. If you're writing a server that's only used inside your own company, and thus doesn't need to scale quickly, it doesn't really matter... in which case, Windows is a decent option, especially if you have a lot of Windows devs in-house.

However, for cloud deployment, there are a lot of hassles with scaling on Windows, mostly due to licensing costs and hoops you have to jump through for that.

BTW, I write server software for Windows in my day job, so I'm not some Linux zealot who's bashing Windows for no reason. Windows is fine. Non-free licensing is not (when a good free alternative exists).



