Posts Tagged ‘iOS’

The Big Change Xcode 6.1 Brings (Integration of iOS & Mac OS X for programmers)

Surprisingly, in my view, it is not Swift.

The big change Xcode 6.1 brings is that, at last, you can use iOS-style view controllers on the Mac too.
This nudges you toward putting each major view into its own xib and managing them that way. (Of course, the storyboard is what ties those xibs together in relationships. The arrival of storyboards never really eliminated xibs; open a storyboard file and the answer is right there.)
Naturally, programs with a large footprint will manage memory usage better this way. And with this, the biggest practical difference between Mac and iOS is effectively gone. The frameworks still differ a little, but they feel like different versions of the same framework; in terms of usage patterns, there is no difference.
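As a quick illustration of the new pattern — a sketch under assumptions: an OS X 10.10 target with a storyboard named "Main" containing a scene whose identifier is "EditorScene" (both names are mine, not from the post) — loading a view controller on the Mac now mirrors the iOS way:

```objc
#import <Cocoa/Cocoa.h>

// Load a view controller scene from a storyboard, iOS-style, on the Mac.
NSStoryboard *storyboard = [NSStoryboard storyboardWithName:@"Main" bundle:nil];
NSViewController *editor =
    [storyboard instantiateControllerWithIdentifier:@"EditorScene"];

// Present it the way UIKit programmers are used to
// (hostViewController is assumed to be some on-screen NSViewController):
[hostViewController presentViewControllerAsSheet:editor];
```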

Now the question is whether, on the Mac too, NSView will come to sit above NSWindow. If that happens, iOS and OS X will be unified programmatically.
Would we then use UI* instead of NS*? Which way would the merge go? Maybe toward UI*, since NS is a trace of NeXTSTEP.

Still, I like NS. Somehow UI* feels... how should I put it... a bit hippie?

Also, on iOS, Core Foundation (or types at that level) are used more often than their Foundation-level equivalents (UIColor vs. CGColor). Going forward, I hope Apple quickly provides UI* counterparts to the CF*-level types, the way the pre-iOS frameworks did.
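For instance — a minimal sketch of the two levels being contrasted, assuming `view` is some on-screen UIView — UIColor is a thin wrapper over the Core Graphics type, and you still have to drop down to CGColorRef whenever you talk to a CALayer:

```objc
UIColor *tint = [UIColor colorWithRed:0.2 green:0.4 blue:0.9 alpha:1.0];
CGColorRef cgTint = tint.CGColor;                       // down to the CG/CF level
view.layer.backgroundColor = cgTint;                    // CALayer only takes CGColorRef
UIColor *roundTrip = [UIColor colorWithCGColor:cgTint]; // and back up to UIKit
```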

Difference between Objective-C compilation for Mac and iOS for 32 bit

For the past several years, Apple has tried to make the kernels for Mac and iOS identical. In Snow Leopard, they got rid of the PowerPC code and started to introduce a 64-bit kernel.

With Lion, as far as I know, the kernel version of iOS caught up with that of Mac OS X. Since then, it has been widely said that the kernels for iOS and Mac are the same.
However, Apple only recently introduced 64-bit iOS with iOS 7. So, although the Mac kernel had long been built as 64-bit, iOS had been built as 32-bit until then.
Anyway, the trend is that the Cocoa frameworks are converging, and the Objective-C compilers are converging in the feature lists they support.

However, I noticed that 32-bit compilation for the Mac is not up to date compared to that for iOS.

Here is the difference between the two.

32bit compilation for Mac OS X (on the left) and iOS (on the right)

Although the overall direction is to make Mac OS X and iOS identical, it turns out that Apple didn't update 32-bit compilation for the Mac to match that for iOS. The 64-bit runtime on Mac OS X is the so-called 'modern runtime'. But should the runtime differ between Mac and iOS?
I’m not sure because I don’t know the detailed story about what happened at Apple.

Because I have worked on iOS recently, I assumed the 32-bit build environment for OS X would be the same as the 32-bit environment for iOS. But it was not.
Apple’s official feature matrix can be found here.
What's easy to miss is that the modern runtime is only for 64-bit on the Mac.
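One place the difference bites — a sketch; the preprocessor check is my own illustration, not Apple's guidance — is synthesized ivars. On the legacy 32-bit Mac runtime, @synthesize cannot create backing ivars, so they must be declared by hand; on the modern runtime (all of iOS, and 64-bit Mac) the property declaration alone is enough:

```objc
#import <Foundation/Foundation.h>
#import <TargetConditionals.h>

@interface Person : NSObject
{
#if !TARGET_OS_IPHONE && !__LP64__
    // Required on the fragile (legacy) 32-bit Mac runtime:
    // @synthesize cannot add ivars there.
    NSString *_name;
#endif
}
@property (copy, nonatomic) NSString *name;
@end

@implementation Person
@synthesize name = _name;  // on the modern runtime this alone creates _name
@end
```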


Spark Inspector : Its flexibility

OK... using Spark Inspector was very easy.

At first there was one odd thing: where is SparkInspector.framework?
It's located inside the Spark Inspector app. However, if you use its "Framework Setup Assistant", the app injects the frameworks to be linked into a chosen project for you.

One thing I didn't like: just as most Cocoa programmers tend to set up an Xcode project, it puts the framework setting in the "Target" Build Settings rather than the "Project" Build Settings.

If you want those link settings to be the default for every target you add later, the easier way is to put the setting in the "Project" Build Settings. I know that many Mac/iOS app developers, especially those who jumped into Mac/iOS development after iOS became popular, don't understand why both "Target" and "Project" Build Settings are necessary.
Anyway, this is not a big deal.

Now, let’s talk about cool things.

Spark Inspector app

As you can see from the picture above, your app runs on a simulator or an actual device while you debug, and the Spark Inspector app intercepts it and displays the view hierarchy in list form in its left pane, a visual representation of it in the middle pane, and the properties of the chosen view/layer in the right-most pane. So, while you are debugging and interacting with a real device or a simulator, you can select a layer/view in Spark Inspector and change its colors, tints, and other properties on the fly.
You don’t need to paint your layer/view in your code for debugging purpose anymore!
Isn’t that cool?

Also, the most important and useful feature of this app is the 3D display of those views and layers. When one view completely occludes an underlying view, and some transition is (unexpectedly) applied to the underlying view, you can still check visually what's happening.

You could say that recursiveDescription is good enough. (It's an undocumented message of UIView.)
However, when there are lots of views/layers, it can be very hard to tell which one is which in that textual output. This great helper tool makes UI debugging much easier.
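For comparison, the recursiveDescription route looks like this — a sketch for debug builds only; since the selector is private, I look it up dynamically and guard it with respondsToSelector: (under ARC you may also need to silence the performSelector leak warning):

```objc
#ifdef DEBUG
SEL dumpSel = NSSelectorFromString(@"recursiveDescription");
if ([self.view respondsToSelector:dumpSel]) {
    // Prints an indented, textual dump of the entire view hierarchy.
    NSLog(@"%@", [self.view performSelector:dumpSel]);
}
#endif
```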

After PaintCode, I think this is another must-have tool for iOS developers!
I’ll buy it soon.

BTW, there is one little glitch. As you can see, the content of the right-most pane invades the right border of its superview, spilling past the right edge.

Maybe Apple people are considering Mac OS X/iOS hybrid apps?

The last time I looked up documentation on how to build an iPhone/iPad hybrid app was roughly two years ago. (When did the 1st-gen iPad come out?)

At that time, I think there was no "platform key" in this identifier format in Info.plist.


According to the section "Updating Your Info.plist Settings" in "Creating a Universal App", it says:

For apps that run only on iOS, you can omit the platform string. (The iphoneos platform string is used to distinguish apps written for iOS from those written for Mac OS X.)
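The format in question looks roughly like this — an illustrative Info.plist fragment; the orientation key is just my example of the `key_root-<platform>~<device>` pattern, not something the quoted documentation mandates:

```xml
<!-- platform-qualified: applies only when built for iphoneos -->
<key>UISupportedInterfaceOrientations-iphoneos</key>
<array>
    <string>UIInterfaceOrientationPortrait</string>
</array>
<!-- device-qualified: iPad only -->
<key>UISupportedInterfaceOrientations~ipad</key>
<array>
    <string>UIInterfaceOrientationLandscapeLeft</string>
</array>
```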

Hmm... what does that mean? Are they preparing a unified executable file format covering iOS and Mac OS X, as they did for 32/64-bit and Intel/PowerPC?
So you could choose whether a project is built for Mac OS X, for iOS, or for both?
Surely iOS uses the ARM instruction set in the Mach-O format (I believe), while Mac OS X uses the x86/x64 instruction set in Mach-O. Then... yeah... it could be possible to have one bundle.

Hmm... if they announce a Mac-iPad hybrid device, that could be interesting. I like Lenovo's effort in this direction with Windows 8.


Difference between auto synthesis and "traditional" synthesis

After the introduction of the iPhone SDK, many things have changed very quickly.
Some changes were made very publicly, while others were not. Even when there was no explanation of a change, you could figure some of them out by looking at the basic template code Xcode generates by default.
However, some are hidden.
I was away from iOS/Mac development for a while and recently tried to catch up. One of the simple but astonishing changes was the internal member variable the Objective-C compiler creates.

For example, let's say the interface file looks like this.

@interface JAWindowController : NSObject

@property (retain, nonatomic) IBOutlet UIToolbar *toolBar;
@property (retain, nonatomic) IBOutlet UIButton *buttonOne;
@property (retain, nonatomic) IBOutlet UITextView *textViewOne;

@end


Then, when ARC is off and auto synthesis is used, the compiler creates internal variables matching the properties, with "_" prefixed to each property name.

#import "JAWindowController.h"

@implementation JAWindowController

- (void)dealloc
{
	[_buttonOne release];
	[_textViewOne release];
	[_toolBar release];
	[super dealloc];
}

@end
However, if explicit synthesis is used, it creates default internal variables whose names are the same as the property names, like this.

The red underlines mean a variable with such a name doesn't exist.

When ARC is on, the naming works the same as when ARC is off. The only difference is that you no longer write a dealloc that releases each object, because Objective-C objects are released automatically by the ARC mechanism.
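For reference, a sketch of the explicit ("traditional") spelling against the same JAWindowController interface: a bare @synthesize names the ivar exactly like the property, while the `property = _ivar` form reproduces what auto synthesis does.

```objc
#import "JAWindowController.h"

@implementation JAWindowController

@synthesize toolBar;                    // ivar is named toolBar (no underscore)
@synthesize buttonOne = _buttonOne;     // same naming auto synthesis would pick
@synthesize textViewOne = _textViewOne;

- (void)dealloc
{
	[toolBar release];                  // note: no underscore for this one
	[_buttonOne release];
	[_textViewOne release];
	[super dealloc];
}

@end
```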

Getting camera resolution on iOS devices

Apple doesn't provide an easy-to-use class or message for retrieving the camera resolution. However, people say there is a way to get the information.

So, based on that, I wrote this code.

- (void) setupVideoCaptureSession
{
	self.isSetUpOK = true;
	self.cameraCaptureSession = [[AVCaptureSession alloc] init];
	self.cameraCaptureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
	BOOL isBackCamera = ([self.cameraCaptureDevice position] == AVCaptureDevicePositionBack);
	NSError *error = nil;
//	NSLog( @"position : %@", ([defaultCaptureDevice position] == AVCaptureDevicePositionBack)? @"Back":@"Front" );
	if (isBackCamera)
	{
		self.cameraCaptureDeviceInput = [AVCaptureeviceInput deviceInputWithDevice:self.cameraCaptureDevice error:&error];

		if( error == nil )
		{
			// Start of bulk configuration : It's not really needed here, but just in case...
			[self.cameraCaptureSession beginConfiguration];
			if( [self.cameraCaptureSession canSetSessionPreset:AVCaptureSessionPresetPhoto] )
			{
				self.cameraCaptureSession.sessionPreset = AVCaptureSessionPresetPhoto;
			}
			// End of bulk configuration
			[self.cameraCaptureSession commitConfiguration];
			[self.cameraCaptureSession startRunning];
		}
		else
		{
			self.isSetUpOK = false;
		}
	}
	else
	{
		self.cameraCaptureDevice = nil;
		self.isSetUpOK = false;
	}
}

- (CMVideoDimensions) cameraResolutionDimension
{
	CMVideoDimensions resolutionDimension = { 0, 0 };
	if( self.isSetUpOK )
	{
		if( self.cameraCaptureSession.isRunning )
			NSLog( @"capture session is running.");
		else
			NSLog( @"capture session is not running");

		for( AVCaptureInputPort *port in self.cameraCaptureDeviceInput.ports )
		{
			if( [[port mediaType] isEqualToString:AVMediaTypeVideo] )
			{
				self.cameraCaptureInputPort = port;
				CMVideoFormatDescriptionRef descRef = [port formatDescription];
//				resolutionDimension = CMVideoFormatDescriptionGetDimensions([port formatDescription]);
				resolutionDimension = CMVideoFormatDescriptionGetDimensions(descRef);
			}
		}
	}
	return resolutionDimension;
}

If setupVideoCaptureSession and cameraResolutionDimension are called in that order, this should retrieve the camera resolution in pixels. However, it doesn't. Someone told me it's a bug, so I filed one.

But I'm not sure. This is so obvious. How could Apple let this pass through their SQA process? As someone who knows the quality they used to deliver, this should not happen. Nowadays I notice lots of bugs in their tools, frameworks, and so on. What's happening at Apple these days?

What happened to Apple?

After a lot of struggling at home yesterday with the sudden expiration of iOS 6.1 beta 4, iTunes now displays the "Check for Update" and "Restore iPhone…" buttons.

iTunes now displays “Check for Update” and “Restore iPhone…” buttons

iTunes didn't display them yesterday, i.e. Jan. 27th, 2013 Pacific time.
Why didn't it display those buttons yesterday?

Let me point out a few things.
When a beta image expired in previous versions (I'm not just talking about iOS 6.x or 5.x; I have used every iOS version since the beginning), Apple allowed a sufficient number of "cushion" days. So even if the following beta image was not installed, the old version stayed alive. If I remember correctly, on iOS 3.x I installed beta 2, skipped beta 3, and updated to the public version once it was announced. There was no problem using the iPhone/iPod touch with beta 2 in the meantime.
However, yesterday my iPhone suddenly displayed an "Activation Needed" message. I was out of the house most of the day, so I didn't know whether a new version had been released on Jan. 26th, which was a Saturday.
Then, when I came back home, iTunes didn't display the "Check for Update" and "Restore iPhone…" buttons. So although I downloaded the latest beta image, I couldn't update my phone. My iPhone had effectively become a brick.

Who is leading Apple's development and SQA teams nowadays? After the iPhone became popular, I started to feel the quality of their work degrade gradually. I understood that they were short on workforce. I've heard that only a handful of people worked on Xcode, while at MS about 250 people work on Visual Studio. I know that many good people on the Mac OS X team wanted to move over to the iPhone team. So I expected lower-quality work for a while. But I trusted that Apple would put out the urgent fires and get back to normal. It turns out they didn't.
I file bugs through Apple's bug report pages. Some are easily noticeable problems.
Even for betas, their internal SQA team should test things thoroughly before publishing to the developer community. Outside developers are not their SQA team. Although we test their software, our main focus is different. I'm not saying their software should be perfect; they are human, and they can make mistakes. But I feel their quality has degraded seriously compared to how it was before. Well, to put it nicely, it's "social SQA".

I personally test my own code in three passes.
While implementing, I frequently debug what I've written to make sure it works as designed. When I finish the implementation, I first debug it to see whether it works as a whole, then run another pass to see whether it breaks any related features. Only then do I hand it over to the SQA team.
So there have been no bugs once things left my hands. I'm not saying I'm always perfect, but I at least try to verify what I do. The only things I don't test thoroughly are features whose designed behavior hasn't yet been settled by the people who requested them; those are rough implementations meant as a base for whatever they eventually ask for.
I don't want to use some grand-sounding term like "Test-Driven Development".
Even without the term, this is common sense.

(Strangely, since 2000, people in this field keep inventing nice-sounding terminology for the same old things. It looks to me like an attempt to convince business people, or programmers without a CS background, that they know a lot or are professionals. But you know what? While that can help with office politics or impress during a hiring process, the people who actually make things work are the ones who have that knowledge melted into their habits, and who can't even spare the time to learn the terminology.)

Look at what Apple ships: Xcode, Mac OS X, iOS... they contain lots of easily visible bugs.
Where are Apple's famous integrity and perfectionism?
It's not Steve Jobs they lost. It's that integrity and perfectionism.

Great tip to figure out view hierarchy

According to objcguy, you can figure out the view hierarchy with one call.

To simulate iPhone camera on your iOS simulator…

Although I haven't tested it yet (I will shortly), this project looks very interesting.
I know the iOS 5 simulator is not worth trying at all: screen rotation, applying a CGAffineTransform, and so on don't work on the iOS 5 simulator the way they do on a real device. Sometimes the buggy iOS 5 simulator even behaves more correctly for certain features than real iOS 5 devices do.

Even now with iOS 6, it can be handy to test your code in a simulator without hooking up your iPhone.

So, let's try it and see how far it can go!
UIImagePickerController-SimulatedCamera, from the holy GitHub!

NOTE : Currently it doesn’t work.

Views of UIImagePickerController

Here is a very good article on how a UIImagePickerController is composed.
