Honestly, I always considered Objective-C to be one of Apple's secret weapons. Both higher and lower level than Java. Objective-C stays closer to the spirit of Smalltalk, yet requires no hoop-jumping when you need the performance of C.
That, and Cocoa was not designed by morons - protocols and delegation, not inheritance, rightfully rule the day.
Protocols and delegation are only consequences of the true difference between Objective-C and the Java/C# mindset, which is that messages between objects are independent of the objects that send and receive them. In other words, messages are independent of the actual function call they ultimately cause. So you can store, forward, and broadcast messages, delay invocations, change invocation targets (classes, not just objects) on the fly, etc.
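To make that concrete: a message can be reified as an NSInvocation object and dispatched whenever, and at whatever target, you like. A minimal pre-ARC sketch (the Greeter class and its method are made up for illustration):

    #import <Foundation/Foundation.h>

    @interface Greeter : NSObject
    - (void)greet:(NSString *)name;
    @end

    @implementation Greeter
    - (void)greet:(NSString *)name { NSLog(@"Hello, %@", name); }
    @end

    int main(void) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        Greeter *greeter = [[[Greeter alloc] init] autorelease];

        // Capture the message as a first-class object...
        NSMethodSignature *sig = [greeter methodSignatureForSelector:@selector(greet:)];
        NSInvocation *inv = [NSInvocation invocationWithMethodSignature:sig];
        [inv setSelector:@selector(greet:)];
        NSString *name = @"world";
        [inv setArgument:&name atIndex:2]; // indices 0 and 1 are self and _cmd

        // ...store or forward it, then dispatch it later, at any compatible target.
        [inv invokeWithTarget:greeter];

        [pool drain];
        return 0;
    }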
Which is nice, except that Visual Studio 2010 seems to be an unreliable pig of an application. Which is a shame, as I really rather like Visual Studio 2008.
Adding messaging to C to create Objective-C must be one of the most elegant language hacks ever. Brad Cox and Stepstone deserve all the credit, not Apple (NeXT).
I'll agree with you on the protocols but not on the delegation. In fact, Apple's engineers are all over the place on applying delegation and event handling. Some components have up to three different ways of handling callbacks: delegation, events, and notifications. Delegates are fine when you need a one-to-one handshake, but they shouldn't be used for general callbacks.
Agreed, for the most part. Having closures in the language can really take away the repetitiveness of writing delegate protocols and conforming to them. I wonder if they will fully obviate the need for delegates?
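For what it's worth, a rough sketch of the block-based style (all names hypothetical): the callback is supplied inline at the call site, with no protocol to declare or conform to.

    #import <Foundation/Foundation.h>

    typedef void (^WorkHandler)(NSString *result);

    @interface Worker : NSObject
    - (void)doWorkWithCompletion:(WorkHandler)handler;
    @end

    @implementation Worker
    - (void)doWorkWithCompletion:(WorkHandler)handler {
        // ...do the work, then call straight back; no delegate object needed.
        handler(@"done");
    }
    @end

    int main(void) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
        Worker *worker = [[[Worker alloc] init] autorelease];
        [worker doWorkWithCompletion:^(NSString *result) {
            NSLog(@"callback got: %@", result);
        }];
        [pool drain];
        return 0;
    }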
Delegates do make it a ton easier to write performant non-blocking code. It's almost like writing code for Node.js, in a way, except without the deeply, deeply nested function calls.
There are 7 delegate callbacks, 2 of which return void, so they're not really delegation methods.
There is also a large list of UIControlEvents sent around value-change and touch events.
Lastly, 3 NSNotifications can also be dispatched by this object when data changes.
Some of the list components are equally bad. The rule of thumb for an ideal delegate-vs-event model is this: if you need a value returned, it should be a delegate method.
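To illustrate the rule of thumb with a made-up component: values get pulled through a delegate, while no-reply announcements go out as notifications.

    // Hypothetical names throughout; not an actual UIKit interface.
    @protocol ListDataSource <NSObject>
    - (NSUInteger)numberOfRows;              // a value must come back: delegate
    - (BOOL)shouldSelectRow:(NSUInteger)row; // a value must come back: delegate
    @end

    @interface ListView : NSObject
    @property (assign) id<ListDataSource> dataSource; // non-retained, pre-ARC style
    - (void)reloadData;
    @end

    @implementation ListView
    @synthesize dataSource;
    - (void)reloadData {
        NSUInteger rows = [dataSource numberOfRows]; // ask the delegate
        NSLog(@"reloading %lu rows", (unsigned long)rows);

        // Fire-and-forget: any number of observers, no answer expected.
        [[NSNotificationCenter defaultCenter]
            postNotificationName:@"ListViewDidReloadNotification" object:self];
    }
    @end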
I prefer to think of it more as "this language has all the memory safety of C combined with all the blazing speed of Smalltalk."
I'm trying to like Objective-C but I don't. It feels like coding simultaneously in two different languages and you're never quite sure which one will be less painful to use.
It makes you write statically typed code, but you still can't figure out what methods a class has in the IDE. On top of that, half of a protocol usually doesn't need to be implemented, but you have no idea which parts, and the IDE again can't just fill it out for you. I find I spend more time looking through documentation and copying interface signatures than coding. It has some kind of 'automatic memory management', but it seems barely better than that of C. And if you get a compiler error, it usually points you to a piece of code that has nothing to do with the actual error. I'm sure Objective-C was impressive in 1986, just like Java was in 1994, but it's 2010.
> you still can't figure out what methods a class has in the IDE.
Maybe I don't understand what you mean, but isn't that just an Xcode auto-completion feature? Do you mean you can't see which methods belong to the class and which are inherited?
> It has some kind of 'automatic memory management'
Initially I was not too impressed by the alloc/retain/release/autorelease parts of Obj-C, but now that I grok the concept well, I think it's awesome. I'm only coding for iOS devices, but my understanding is that for OS X there is a stable GC system. On iOS, you manage the memory yourself. I would have liked not to worry about memory at all, but it's not a big deal after all. If you alloc/copy something, make sure you release it.
I mean the IntelliSense (autocomplete + docs + method signature) in Xcode blows chunks compared to Visual Studio or even Eclipse.
Lots of things individually in Obj-C/Xcode are minor annoyances, but when you add them all together it's a very annoying toolchain. Comparing the UI/polish in Apple products to Microsoft products, I figured that Xcode would be one of the best dev environments ever. I think it's one of the worst.
It's not that any of the issues are insurmountable, I'd just rather not deal with them.
> I mean the IntelliSense (autocomplete + docs + method signature) in Xcode blows chunks compared to Visual Studio or even Eclipse.
That last sentence left me puzzled, seeing as Eclipse has much better "IntelliSense" for Java than VS does for C++ & C# (at least out of the box - ReSharper brings it close to Eclipse levels with C#).
And if for whatever reason ReSharper isn't available to you, check out the Productivity Power Tools. They'll take you in the Eclipse direction. Not all the way, mind you, and they're buggy, but they're helpful nonetheless.
Completely agree about the autocomplete and documentation in Xcode. It's a complete waste, and for a company that spends a lot of time focusing on snappy UIs, I don't understand why it's so slow... and stupid.
The one redeeming quality is that Instruments and the rest of the tools package are quite nice. Our company pays good money to get the same level of tooling for Visual Studio.
To me, as someone not familiar with ObjC/OS X memory management, the last sentence reads like "...it's not a big deal after all. If you malloc/fopen something, make sure you free/fclose it."
The difference is there is core support for reference counting and cleaning up unreferenced items. Any time you are using an object, you grab a reference; at the end of a call chain (managed as a part of message passing), if an item has no references, it will be deleted automatically.
With C, you are forced to care who actually allocated something and figure out a protocol for dealing with getting it released.
With Objective-C, I can allocate something, hold my reference for the duration of the call, release my reference, and return it. If the caller wants a reference, they can then grab it; if they don't, it will get deleted.
Rather than being responsible for ultimately disposing of an object, they need only tell the runtime that it is no longer needed.
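In code, that handoff looks roughly like this (a pre-ARC sketch; the method name and the _keptGreeting ivar are made up):

    // Callee: allocate, use the object for the duration of the call, then
    // hand it back with a deferred release (autorelease).
    - (NSString *)makeGreetingFor:(NSString *)name {
        NSString *greeting = [[NSString alloc] initWithFormat:@"Hello, %@", name];
        return [greeting autorelease]; // ownership given up, but not immediately
    }

    // Caller: retain only if the object is needed beyond the current use.
    NSString *greeting = [self makeGreetingFor:@"world"];
    NSLog(@"%@", greeting);            // transient use: no retain needed
    _keptGreeting = [greeting retain]; // keeping it: express interest explicitly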
A retain count (a count of the parties with an interest in an object) is maintained by the runtime. When the retain count reaches zero, the object is freed. When an object is created (using alloc or copy), it has a retain count of 1. Another object can express its interest by calling the "retain" method. When it is done, the "release" method should be called.
If an object is being returned, but shouldn't be freed right away, there is a mechanism called "autorelease" where the call to release is delayed until the end of the current runloop, giving other objects a chance to retain the returned object.
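A small sketch of those mechanics, with the conceptual retain counts in comments (setList: is assumed to be a retaining setter):

    NSMutableArray *list = [[NSMutableArray alloc] init]; // retain count: 1 (alloc)
    [otherObject setList:list]; // the setter retains: count 2
    [list release];             // our interest ends: count 1; otherObject keeps it alive
    // When otherObject eventually releases it too, the count hits 0
    // and the runtime frees the array.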
Cocoa uses autorelease pools for short-lived objects, so most of the time you don't have to free them.
For long-lived objects you can use synthesized properties, which handle retain/release for you (see the sketch below).
In other cases, you "guard" your use of an object with retain/release. You can use these in pairs in the same place, rather than having malloc() in one part of the program and free() in another.
You can freely pass a reference to an object without worrying that you might free it while it's still referenced somewhere else. You don't have to think about ownership of an object.
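For example, a synthesized retain property generates that guarded setter for you. A rough sketch of what the compiler produces (Person is a made-up class):

    @interface Person : NSObject {
        NSString *_name;
    }
    @property (retain) NSString *name; // accessors guard with retain/release
    @end

    @implementation Person
    @synthesize name = _name;
    // The generated setter is roughly equivalent to:
    // - (void)setName:(NSString *)newName {
    //     [newName retain];
    //     [_name release];
    //     _name = newName;
    // }
    - (void)dealloc {
        [_name release]; // the one release you still write by hand
        [super dealloc];
    }
    @end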
Indeed. I've become quite enamoured of late with the automatic reference counting in the Boost C++ library as a happy medium between GC and manual reference counting. Of course, ObjC has no static allocation, so that can't be done here.
ObjC on iOS is still manual memory management, but the advantages over C are reference counting (which makes composition easier) and a defined protocol for when references won't be used anymore.
According to the link you posted, that's over four times slower than C++ virtual method dispatch (and note that most method calls in ordinary C++ code are not virtual). Also, IMP caching is hardly easy and is quite fiddly: see for example the benchmark code in that link as well as [1].
I highly recommend reading the tour of objc_msgSend [2]. It's very eye opening: for example, there's a cache scanning loop even in the fast case.
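For reference, IMP caching boils down to fetching the C function pointer behind a selector once and calling it directly. A sketch (array and items are assumed to exist already); the cast is where it gets fiddly:

    // Bypass objc_msgSend in a hot loop by caching the method's IMP.
    SEL sel = @selector(addObject:);
    IMP imp = [array methodForSelector:sel]; // one dynamic lookup

    for (NSUInteger i = 0; i < [items count]; i++) {
        // The cast must exactly match the method's real signature
        // (id self, SEL _cmd, id object), or behavior is undefined.
        ((void (*)(id, SEL, id))imp)(array, sel, [items objectAtIndex:i]);
    }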
> According to the link you posted, that's over four times slower than C++ virtual method dispatch
Hmm, check the results on the same page for the 64-bit Cocoa runtime. This is where most of the optimizations were made, because Apple used the 32->64-bit transition as an opportunity to make most of its binary-compatibility-breaking changes to the language.
There, in 64-bit, the IMP-cached msg send is actually FASTER than C++ virtual method dispatch.
And if your app's performance suffers because of method calls / msg sends, you are doing it wrong. They should take an insignificant amount of time compared to the actual processing your app does.
The C safety / Smalltalk speed quote probably was quite funny when it was relevant -- some decades ago.
And all the rest of your "language criticism" is really about the IDE. Coincidentally, it's all solved by Xcode 4: better autocomplete, automatic fixes, relevant error descriptions, etc. The main problem was GCC -- it just didn't provide the info and interfaces needed for real-time, AST-guided IDE features. In Xcode 4, Apple implemented all that stuff by leveraging LLVM.
Apart from that, Xcode < 4 was not a bad IDE by any standard. Worse than Eclipse/VS in several ways, yes, but "bad" is a stretch. It also has excellent profiling tools, and a great GUI editor (Interface Builder) that gets MVC right.
Also: Obj-C has had real GC for years now. And the old retain/release scheme, still used on iOS, is not "barely" but vastly better and easier than C, while maintaining the performance benefits.
EDIT: Site's nuked for me. Main page cache from the fireballed.org site:
Sony’s Networked Application Platform is a project designed to leverage the open source community to build and evolve the next generation application framework for consumer electronic devices.
The developer program gives access to a developer community and resources like SDK, tools, documentation and other developers.
The foundation upon which this project is based comes from the GNUstep community, whose origin dates back to the OpenStep standard developed by NeXT Computer Inc. (now Apple Computer Inc.). While Apple has continued to update their specification in the form of Cocoa and Mac OS X, the GNUstep branch of the tree has diverged considerably.
The GNUstep core libraries strictly adhere to the OpenStep standard and OPENSTEP implementation. They consider changes and additions to their API only under the following circumstances:
They add methods and classes, either from Cocoa or their own extensions, if they add substantial value and don't interfere with OpenStep and/or Cocoa compatibility.
They generally don't remove things unless there is a clearly better implementation in a newer Cocoa API.
Where there is a real problem with a change, they will attempt to find a technically superior workaround. In rare cases, this might involve a change to the original OpenStep API.
We depart somewhat from the GNUstep adherence in that our goal is to thoroughly modernize the framework and optimize it to target modern consumer electronic (CE) devices. These modern conveniences include such features as touch displays and 3D graphics.
I'm sure they aren't writing another phone OS, but what devices exactly are they targeting? TVs, set-top boxes, MP3 players?
I loved the hardware of a Sony MP3 player I bought a couple of years ago, but it was ruined by absolutely atrocious syncing software. I hope this means they open up their interfaces a little more and let other people integrate with their products.
PlayStation 4. It's a Good Thing, but I've been overly optimistic about Sony adopting a policy of greater openness before. IIRC, devices classed as computers incur lower import charges in the EU than devices classed as entertainment accessories (i.e. games/toys). Hence Linux on PS2, PS3, and... PS4? So far, this has been something of a letdown after launch.
Sony had, a long time ago, a family of Unix workstations. At that time, those systems were called "open" in the sense that they played well with other systems over documented protocols.
"SNAP stands for Sony's Networked Application Platform and is the early stage of a new ecosystem for making downloadable 3rd party applications available to networked devices like TVs, Blu-ray Disk players, etc."
I got approximately 1.5 pages of their server to load. I think it's being used for "smart" TVs first. It sounds like they're trying to build an Apple-free dev environment familiar to Apple devs.
Quoth the raven:
SNAP is a completely open-source licensed (for license details click here) development environment based on GNUstep.
Historically, GNUstep is based on the OpenStep specification developed by NeXT (now Apple Computer Inc.) plus additional extensions added by Apple through the Cocoa framework. SNAP is an extension of this framework created by Sony and will take a big step forward in evolving a new application framework.
The initial areas of enhancement target the windowing system and graphic framework but further modifications and refinements address networking, widgets, UI building, 3D graphics, file system, garbage collection, performance and execution size. The overall goal of the SNAP project is to develop and evolve a next generation native application framework for embedded devices. While SNAP does not currently target a specific hardware platform or product, the long-term goal is to enable SNAP on Consumer Electronic (CE) devices.
Wow. Wouldn't have guessed this would happen in a million years.
We are all well aware of Apple's successes in the last few years, but they all seemed to be happening in their own little walled-off corner of the world. It's typically Apple's tech versus the rest of the industry. Now here's a sign that Apple's influence is being felt at even deeper levels.
I guess WebKit was another sign of this, but that doesn't seem as stark as some other company adopting Objective-C.
I don't see how this is relevant. Obviously KHTML is now surpassed by WebKit because most of the development effort for KHTML was diverted into WebKit, plus a huge influx of new developers from Apple and now Google. That doesn't take away from the fact that Apple benefited from the work the KHTML guys did. They could have used Gecko, but they didn't; obviously there was something good about KHTML.
Still, neither GNUstep/NS, KHTML/WebKit, nor even the Mach/XNU kernel were native Apple tech. To portray them as such in an "Apple vs. the world" scenario is a bit misleading and presumptuous. WebKit and KHTML are still mutually beneficial projects.
I don't see where an "Apple vs. the world" argument is made. More like: Apple did its own thing before and no one else cared. Now they are influencing other companies more than before.
Along with WebKit, I'd say that app stores and touch UI are other areas where Apple has been influencing tech in recent years.
I think the leading example of "Apple's tech versus the rest of the industry" is the use of USB - which only goes to show how weak the basis of the meme is.
Allenbrunson's point is that WebKit is an example of Apple's influence beyond Apple. You could argue that the choice of WebKit wasn't important to Chrome - that it would have been just as successful had they chosen another rendering engine. But they did choose it. And that is true irrespective of the fact that WebKit is based on KHTML.
How so? Squirrelfish was released (at least in source form) in June 2008, V8 wasn't announced until September of that same year. Apple, Mozilla, and Opera were duking it out in public for months before V8 was on the scene.
WebKit most probably did not have any influence here, but was mentioned because WebKit in the past, like Cocoa/Objective-C in this instance, is being used elsewhere. Hence, Apple's technologies are influencing adoption in the industry (yes, GNUstep is not from Apple, but for the most part Cocoa is a distinctively Apple technology).
This is a great coup for GNUstep. Back in 2006, when Greg took over as maintainer, he posted a blog entry about future plans. Lots of people on Slashdot pretty much mocked him and the project.
There were some download links for an SDK and instructions. I didn't download anything right away (stupidly), and when I came back a couple hours later, all that stuff was replaced by this "currently on hold" message. Sigh.
EDIT: You can see info about SDK stuff on Google's cached version of some pages.
GNUstep is more or less a re-implementation of Cocoa (and, prior to that, OpenStep). I'm not suggesting any copyrights have been infringed upon (indeed, I'm an avid user of GNUstep), but I always wondered if GNUstep would fall foul of the same kind of patent threats that Mono detractors seem to feel it is at risk from, should it ever gain commercial traction. In fact, Apple has already made legal threats against manufacturers of Android phones, and Android has nothing to do with Cocoa or Cocoa Touch.
Jobs was reportedly greatly influenced by Sony's late-70s/early-80s design sense, especially wildly popular consumer products like the Walkman. This book has a brief mention of Apple sharing a building with a Sony office in 1977, though I could've sworn I've seen a longer discussion somewhere: http://books.google.com/books?id=W4_P8LDZOmAC&pg=PT38. I think Jobs has actually said somewhere that the iPod was intended to be the "Walkman of the 21st century".
I think the best thing here is that GNUstep is GPL: any improvements (and there will be many) bringing it to the same functionality level as Apple's equivalent will also be GPL'ed and open for all.
GNUstep is actually licensed under the LGPL, so applications that link against GNUstep libs are not required to be similarly licensed. If GNUstep were GPL, however, the app would also need to be GPL'ed.
This is how Trolltech used to charge money for commercial use of Qt. Qt was dual-licensed under the GPL and a commercial license, so GPL'ed software could freely use the library, but non-GPL software would have to pay for the commercial license.
As for linking proprietary code to GPL'ed libraries, the license is not explicit. However, according to Larry Rosen (http://en.wikipedia.org/wiki/GNU_General_Public_License#Poin...), linking, even statically, does not make a derived work (that would be subject to being GPL'ed by accident). Also, the GPLv3 has some different wording. I understand the GPLv3 makes it less likely that a program linked to a GPL'ed library will be considered derivative.
"The GNUstep libraries are covered under the GNU Lesser (Library) Public License. This generally means you can use these libraries in any program (even non-free programs) without affecting the license of your program or any other libraries GNUstep is linked with. If you distribute the GNUstep libraries along with your program, you must make the improvements you have made to the GNUstep libraries freely available."
The thing that surprises me about this is that I took a look at GNUstep recently and it kind of seems... unloved. I imagine Sony has had its work cut out bringing things up to date. Also, I would be interested to know how much GNUstep code Sony will be using and how much it will be re-implementing. I've heard some pretty average things about the state of the GNUstep codebase.
GNUstep as a desktop environment has definitely not gotten the work it needs (the core of the desktop experience, GWorkspace, just got its first official maintainer in years), but the core has been fairly well maintained. They've got pretty much complete support for Obj-C 2.0 features and LLVM clang in a runtime that was written from scratch, and the whole thing is cross-platform (including Windows support).
To me, the most interesting aspect of this is that Sony did all this without communicating with the GNUstep community. We don't yet have a clue what the extent of their changes are, or what components were used or not used, so it's impossible to say whether Sony's work can directly benefit GNUstep.
My impression is that Objective-C is pretty big (for a minor league language) in the US, but that elsewhere in the world its use is nearly non-existent.
Most places I've lived you could throw a shoe and hit a Java developer or a Windows (VB/.net) programmer, but you could set a nuke off and not kill any Objective-C programmers.
I think NeXT and Apple never really bothered promoting it outside of the US. Australia has just about the highest level of iPhone usage per head of population, but prior to the iPhone coming out jobs for Objective-C were running at something like one per N-thousand Java jobs. Now the iPhone pops up, and all of a sudden everybody wants an Objective-C programmer with 5 years experience and proven commercial success on the iPhone... well... guess what... you reap what you sow. Because virtually nobody in Australia invested in Mac programming for all those years, you now don't have a deep pool of talent to dip into.
But a similar thing may happen with Android... anyone who didn't invest in Linux programmers is at a disadvantage.
Trust me... it was rare as hen's teeth in the US also, especially during the period after Java came on the scene but before the iPhone.
(In 2001 I found myself looking for work with 10 months of Java experience and 8 years of NeXT experience. I doubt anyone hiring had any clue what Objective-C, NeXT, OpenStep, or EOF were. Certainly the HR drones didn't.)
Back in the NeXT days, I think interest was primarily in US/UK/Germany, with some interest in France and Japan and maybe some Swiss activity due to the investment banks.
As far as Android is concerned, I'm not sure that Linux experience provides any advantage. The libraries are all standard Java stuff or Android-proprietary. The development tools are also well supported on all platforms (AFAIK).
I vaguely remember that there was an Objective-C environment in one of the 90's game consoles. I want to say PlayStation, but it could have been one of the Sega consoles. Anyone know more?
When I was working at Sega, I innocently asked Toshi Morita if the Saturn port of GCC could support Objective-C. He nearly freaked out since it was already as much as he could handle to keep it working with C++. I never mentioned it again...
Not exactly the same, but the PS2 Linux Kit shipped with GNUstep by default. So at least some people at Sony have always had a soft spot for the platform.
SNAP is Sony's Networked Application Platform and is intended to be the basis for future consumer electronic devices. It is planned to evolve this into a complete software stack that will allow third party applications to run on Sony consumer electronics products. This Linux based software stack will include operating system, device specific middleware, general and device specific Objective C APIs including GNUstep, and key applications. The SDK will provide the necessary tools to begin developing Objective C applications for the SN Application Platform.
On a typical CE device hardware resources are much more limited than in the average PC or even a netbook. In order to get the most out of the available hardware it is therefore necessary to control access to scarce resources and manage their use and coordinate sharing them between applications. This is one goal of the SNAP stack. The other one is to provide the application developer with easy to use libraries to access all capabilities of the underlying hardware without reinventing the wheel.