Windows Presentation Foundation is Half-Baked

by dhovel 11. August 2013 23:29

Having written a fairly major WPF app, I think I can now write about it with some understanding.  While a well-crafted WPF app can be changed in amazing ways without much trouble, I think Microsoft has made some serious errors.  WPF remains a challenging platform, and one that may well not be worth the trouble.

The most spectacular aspect of WPF is data binding.  When apps can be designed from a completely data-centric viewpoint, WPF is a viable platform choice.  I shudder to think about “real” apps such as Vegas Video being written in WPF.  Data is important, but WPF is so inextricably bound to its data that flexibility is often left behind.

Some really ridiculous problems haunt WPF.  Why can’t resources be forward referenced?  This causes all manner of confusion (StaticResource vs. DynamicResource?) and reminds me of the early days of ‘C’ when I wrote all my programs backwards to get reasonable type-checking.

Resource problems plague WPF development.  Trivial changes to XAML result in crashes with absolutely no diagnostic information.  I’ve used dozens of code generators over the years and I’ve never seen a worse one.  Why can’t at least some file/line information be available for resource exceptions?  I find myself keeping “instant backups” for those times when the world ends without a clue.

Data binding is as mysterious as it is powerful.  Much of what I know I’ve gleaned by trial and error—put it in a {Binding} and see what happens.  AFAIK, binding relies entirely on run-time reflection: every name and qualifier (“X.Y”) is tediously parsed and resolved through indirect method invocation.  Why can’t WPF allow “static” as well as “dynamic” bindings?  That is, a “static” binding would be resolvable at compile time.  The primary DataContext for a UserControl or Window could be optionally declared in the XAML so that bindings marked “static” were resolvable at compile time; declaring a binding “dynamic” would explicitly bypass compile-time analysis.

Being a C# programmer from way back, I just don’t think that XAML will be truly useful until “lambda” expressions are allowed in XAML.   Why can’t “real” code be used without programmatically constructing bindings?  Just repeat after me: “It’s a code generator.”
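To make the point concrete, here is what programmatically constructing a binding looks like in WPF today, next to the kind of lambda-based, compile-time-checked syntax I have in mind.  The second form is purely hypothetical (no such API exists); `nameText` and `ViewModel` are illustrative names:

```csharp
using System.Windows.Controls;
using System.Windows.Data;

// Today: the path is a string, parsed by reflection at run time.
// A typo in "Customer.Name" compiles fine and fails silently at run time.
var nameText = new TextBlock();
var binding = new Binding("Customer.Name") { Mode = BindingMode.OneWay };
BindingOperations.SetBinding(nameText, TextBlock.TextProperty, binding);

// Hypothetical: a lambda the compiler could type-check.
// nameText.Bind(TextBlock.TextProperty, (ViewModel vm) => vm.Customer.Name);
```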

How about all-important “property” notifications?  Well, there are two independent systems: INotifyPropertyChanged and DependencyObject.  Both require extensive coding and are highly error prone due to the use of “magic” strings.  The web abounds with “helper” classes to avoid some of the ugliness, but full AOP would be a blessing.  The model is simple: decorate a property with an attribute and bingo: the “set” accessor is only invoked when an actual data backing field change is about to occur, and the associated property changed event is invoked automatically.  There are commercial products that do this, largely through IL “weaving”, but they are costly, non-Microsoft solutions.  IMHO, the entire notification framework of WPF is wrong-headed.
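Short of full AOP, about the best you can do with stock .NET 4.5 is the common helper-base-class pattern.  Here’s a sketch (class names are mine) using [CallerMemberName] to at least kill the magic strings, and suppressing the event when no actual change occurred:

```csharp
using System.Collections.Generic;
using System.ComponentModel;
using System.Runtime.CompilerServices;

// A minimal base class: SetField fires PropertyChanged only when the
// backing field actually changes, and [CallerMemberName] supplies the
// property name so no "magic" string appears at the call site.
public abstract class BindableBase : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    protected bool SetField<T>(ref T field, T value,
                               [CallerMemberName] string propertyName = null)
    {
        if (EqualityComparer<T>.Default.Equals(field, value))
            return false;                       // no real change: no event
        field = value;
        var handler = PropertyChanged;
        if (handler != null)
            handler(this, new PropertyChangedEventArgs(propertyName));
        return true;
    }
}

public class Person : BindableBase
{
    private string name;
    public string Name
    {
        get { return name; }
        set { SetField(ref name, value); }      // no string literal in sight
    }
}
```

It’s still boilerplate per property, which is exactly the complaint: this should be a one-line attribute.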

And how about the vaunted “modifiability” of WPF controls?  Yes, you can put GroupBoxes around your ListBox items, but you’ll spend hours getting things right.  Why hasn’t Microsoft made it easier to see the “PART” inventory of complex UI objects and provide a slew of nicely themed controls?  The best example is this:  the code required to create a significantly distinct main window is daunting: hundreds of lines of XAML and C# just to override or alter elements.  And when your meticulously crafted XAML doesn’t work, will you know why?  No.

And don’t bother trying to use Expressions, since it fails to load all but the simplest VS WPF solutions.  I’m really unclear as to why that product was introduced at all.

Nearly last but not nearly least, VS and WPF are both serious memory and processor hogs.  The layering of reflection-based late-binding on top of a managed framework creaks and groans incessantly.  Just because there’s Moore’s Law doesn’t mean Microsoft should assume that every serious developer has an octo-core machine with 16GB of RAM.

Finally, the acid test of the WPF approach is this:  How easy is it to support “skinning” and themes in WPF?  Answer: really, really, exceedingly hard.  This statement should speak for itself.



Partitioning GPT Disks for Bootcamp

by dhovel 12. May 2013 16:52

After some learning, here are some notes about partitioning large, external GPT disks for use on Macs running Bootcamp.

Disks larger than 2 terabytes must be organized as GPT (GUID Partition Table) disks. This newer structure not only allows vastly larger disks (into the zettabyte range) but also allows direct specification of several partitions without "extended partition tables".  GPT-based partitions, however, are more complex and require "guard" partitions of empty disk space and "dummy" (protective) MBR information.  Both Windows and OSX can create and deal with these disks, but strange things can happen that prevent correct allocation.

I found that OSX had the easiest method for partitioning a GPT disk.  Disk Utility in OSX allows a choice of from 1 to 16 partitions, along with a "drag" slider to apportion the space.  I tried to use this to create a dual-partition disk with one partition for the Mac (HFS+ Journaled) and one for Windows (NTFS).  I rebooted to Windows to convert the larger partition to NTFS.

This seemed to work, and both partitions were recognized on my Macbook under both Windows and OSX. However, when I moved the external drive to another Windows machine, that machine would not recognize the NTFS partition.  I verified that only my Macbook with the Bootcamp drivers would correctly recognize the NTFS partition.

Then I tried the same thing with the "Disk Management" tool under Windows. The problem here was that after rebooting to OSX, Disk Utility would not allow me to reformat the smaller partition as HFS+; I kept receiving an "insufficient space" error.

Finally I was able to accomplish this under Windows.  Using Disk Management, I removed all partitions on the disk.  Then I allocated a large NTFS partition and a smaller one as NTFS (destined later to be HFS+), but I left a sizable "void" of unallocated space beyond the smaller partition.  Rebooting to OSX, I was able to convert the smaller partition to HFS+.  The resulting NTFS partition is recognized by other (non-Bootcamp) Windows machines.

It appears that 1) the Bootcamp drivers allow Windows to recognize disk configurations that other (non-Bootcamp) Windows machines do not, and 2) the OSX Disk Utility has a bug: it will not reduce the space of a partition to provide for the "guard" partition that GPT allocation requires.  By leaving unallocated (empty) space beyond the second, smaller partition, Disk Utility was able to carve the "guard" partition from the unallocated space and correctly erase (reformat) the partition for HFS+.
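For reference, the Windows side of the working recipe looks like the diskpart session below.  The disk number and partition sizes are illustrative only; the essential point is the chunk of space deliberately left unallocated at the end:

```
rem -- run "diskpart" from an elevated command prompt, then:
list disk
select disk 2
rem -- WARNING: "clean" destroys all partitions and data on the disk
clean
convert gpt
rem -- main NTFS partition (~2.5 TB; adjust to taste)
create partition primary size=2500000
format fs=ntfs quick label="WinData"
rem -- smaller partition, reformatted later as HFS+ under OSX
create partition primary size=400000
format fs=ntfs quick label="MacTemp"
rem -- exit WITHOUT allocating the remaining space
exit
```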

The only remaining problem is that the Bootcamp drivers in Windows don't recognize the GPT-based HFS+ partition; I keep getting a message that "the disk in drive X: must be formatted for use in Windows".  I just ignore this error and everything seems fine.

YMMV.  Good Luck.


Hey, Deniers: It’s the Same Science

by dhovel 8. October 2012 21:02

There has again been much press about prominent figures in American life (read: politicians) who deny facts.  They deny Evolution, the Big Bang and Global Warming, among other things.  A few years ago, I personally woke up to realize that both my dentist and my CPA were “Young Earthers”, a position which I find laughable.  Possibly a CPA might never have been exposed to in-depth information about biology or geology, but was this possible for a dentist?

Denying scientific theories such as Evolution in their totality is the same thing as denying cell phones, computers, space travel or jetliners.  What is the basis for this statement?  It’s the same science.

The “scientific principle” is the idea that true knowledge is only obtained through a process of formulating hypotheses, testing them, and discarding and improving them.  These tests might entail actual laboratory work, experiments in the “wild”, wind tunnels, swimming pools or today’s modern computer simulation environments.  Often, as was the case with Darwin and Einstein, it may involve only “Gedankenexperiment” or thought experiments: reasoning that can be performed by a knowledgeable person in an armchair.

Given that there are few differences between the science that allows jetliners to stay aloft and the conclusions of modern geology and biology, those who deny Evolution, the Big Bang and other well-established theories are, in essence, denying heavier-than-air flight and cellular telephony!

Darwin’s theory was based on a lifetime of observation and experimentation, adjoined with the work of hundreds of others.  Due to societal pressures, he declined to publish his work until Alfred Russel Wallace was about to publish results which almost exactly paralleled Darwin’s own.  This was a classic example of Sherlock Holmes’ principle that once the impossible has been eliminated, whatever remains, however improbable, must be the truth.  Two men from opposite ends of the earth had arrived at exactly the same intellectual juncture.  At roughly the same time, Gregor Mendel was performing simple, daily experiments with pea plants.  Though largely ignored in his lifetime, his results would establish a mathematical basis for inheritance long before anyone could spell DNA.

Geologists Charles Lyell and Georges Cuvier, although differing in their theoretical approaches, strongly established the principle that “the rocks don’t lie.”  The full normalization of the seemingly competing theories of uniformitarianism and catastrophism did not occur until early in my life, during the 1960’s, when the theory of plate tectonics was propounded and elaborated.  Once again, the proofs were obtained by using chemistry, atomic physics and rational analysis using the enormous mountains of evidence (literally) to be found all over the earth.

The theory of the Big Bang was first formulated by Georges Lemaître, a Catholic priest, early in the 20th century.  As developed through the work of Edwin Hubble and George Gamow and the scientific discoveries of Penzias and Wilson, the nascent Big Bang theory was validated to the point that it predicted not only the elemental composition of the universe but the numerical value of the background temperature of the universe!

So we come to Global Warming.  Mathematical modeling of weather is a very difficult and ongoing theoretical effort.  As with any theory, not every observable phenomenon can yet be predicted by current models; however, the extent and nature of the changes in the world’s macroclimates are reproduced faithfully in computers around the world by scientists from a wide variety of backgrounds and approaches. 

Boeing corporation, as an example, has computer models sophisticated enough to predict the behavior of huge new jetliners under almost all possible aerodynamic conditions without ever building a single plane.  When the U.S. Department of Defense issued a challenge to the aeronautical industry to build the next generation fighter plane, the so-called Joint Strike Fighter, both Boeing and Lockheed-Martin designed their planes entirely on computer.  Both resulting planes were constructed as complete, ready-to-fly prototypes based on the computer generated plans.  These prototypes flew and met almost all of the mandated specifications, which were complex and demanding.

This same approach has been applied to earth sciences.  Yes, there are more variables.  No, you cannot re-run history to see how changes in variables change outcomes.  However, every model carries with it a level of confidence.  This level is based not only on how closely a model matches observations but on how often the model, re-run with changes to initial conditions, reproduces a result similar to previous runs.  Almost all research in nuclear weaponry has used and still uses the same modeling methodology.

The truth is that those who deny these proven theories always have another agenda.  In the case of Evolution and the Big Bang, the agenda is almost always the fear that acceptance of the truth will somehow undermine religious belief.  I find this hard to accept, since during my extensive Catholic school education, I never once heard a priest or other teacher challenge these theories.  Recently, a world-wide conference on the Big Bang was held at the Vatican.

The deniers of Global Warming have a different agenda.  They fear that accepting anthropogenic global warming will result in trillions of dollars being spent on unproven mitigation techniques.  Some, it may be said, are merely angry at the notion that humans can have such a deleterious effect on the earth.  I personally don’t think that fear of the consequences of the truth should be a rationale for denying the validity of a theory.  We cannot “cherry pick” science, accepting only what we choose, precisely because it’s all the same science.

It is time to treat deniers as the crackpots they are.  I, for one, intend to use a litmus test when choosing professional assistance in the future.  If I’m planning to use a lawyer, doctor, dentist, accountant or other professional in my life, I will ask that person if she believes in Evolution or the Big Bang.  If the answer is in the negative, I will decline to use her services.  This is simply because anyone who can compartmentalize reality to such a degree as to deny proven science is to be suspected of a form of modern neurosis that could have negative consequences in the performance of her duties.

However, I’m not going to use Global Warming as a litmus test simply because it’s a very complex topic with many aspects that are poorly understood.  For example, I don’t believe that we can or should spend those trillions of dollars mitigating CO2 emissions with suspect methodology, at least until more evidence is gathered; we should, however, do everything we can to use proven mitigation techniques.  I doubt very much that “cap and trade” will prove to be a viable mitigation technique, and evidence accumulated since the Kyoto treaty seems to bear out my opinion.

In summary, then, science is science.  It’s a process, and some theories are currently more complete and powerful than others.  Those who deny Evolution or the Big Bang should immediately stop flying on airplanes, using cell phones, watching TV or surfing the internet.  That would only be consistent with their worldview.  Those who deny Global Warming should read the latest evidence closely: it’s all there.  The problem is the commingling of science with public policy.  We should accept the results of science and base our policies insofar as possible on results that are proven well enough to support significant investment. 

Remember that no theory, even gravitation, can produce an “exact” answer beyond about 17 significant digits.  However, such an answer allows us to fly rockets to Mars and use Global Positioning Satellites to discover our location to within a few feet.



Your Dog

by dhovel 27. December 2011 13:40

Your dog is not particularly cute.

Dogs can be very cute, but they are similar to people in that only about one in twenty is genuinely attractive.  If others find your dog to be cute, that’s great.   But it’s not a good topic of conversation.

Your dog is not particularly smart.

Most dogs are of roughly the same intelligence.  They can do certain clever things but are usually blissfully ignorant of the obvious.  Unless you have a trained Border collie, your dog is probably just average.  Please remember the “Lake Wobegon” effect: they can’t all be good-looking and above average.

Your dog is nice—to you.

Dogs have been bred for tens of thousands of years to distinguish between those who feed them and strangers.  Your dog’s behavior toward you is no indication of how it will behave toward others.  Only training and monitoring will make your dog a truly safe member of society.

Your dog is not a child.

If you want to carry your dog in a belly pack or a stroller and coddle it all day, that’s your prerogative.  However, please refrain from recounting all the myriad ways your dog is like your child.  It’s not.

If I’m in busy traffic and I see a dog in my car’s path, I will try to avoid hitting it.  But I will not deliberately collide with nearby cars or risk a rear-end collision to avoid the dog; doing so might cause an injury or risk a human life.  If, on the other hand, I see a child in my car’s path, I will do everything I can, including jumping lanes, to avoid her.  After all, the nearby drivers and I are protected by tons of steel, seat belts and air bags.  And, after all, it’s a child.

Your dog is an “it”.

Most dogs these days are neutered, and dogs, as a species, are not dimorphic.  I am not interested in whether your dog used to be male or female.  Please do not correct me if I get the gender wrong, since I just don’t care.

Yes, your dog’s bark is annoying.

Every dog’s behavior is a reflection of its owner.  If you would not personally bark at people the way your dog does, remember that we’re all thinking it’s really you doing the barking. 

Please don’t expect that others can distinguish all the marvelous nuances of your canine’s self-expression (playfulness, uncertainty, boredom, etc.) because they vary by dog and are mostly just the owner’s interpretation.  Almost all barking sounds aggressive because it’s supposed to.

Dogs are wonderfully trainable.  Many attack dogs are trained to approach an intruder completely silently.  If a Rottweiler or Doberman can be trained that way, your dog can be trained not to bark at strangers from the back seat of a parked car.

Yes, your dog needs a leash.

Voice command is generally a myth.  Given the right temptation, almost all dogs will dash off after a rabbit, another dog or just because of high spirits.  If the law or common sense says use a leash, do so.

If I’m walking, running or riding my bicycle, I don’t want to have to pay laser-like attention to your dog to make sure I don’t stumble over it.  And please keep your leash out of the path of others.  Imagine blocking other people’s path with a long stick—it’s the same thing.

No, your dog can’t come in my house.

I really don’t care if your dog just came from the beauty parlor; I don’t let dogs in my house.  That is my right.  Whether or not you allow your dog into bed with you, or allow it to sit on your lap while you drive, is entirely irrelevant to my decision to exclude animals from my house.  I won’t take offense at your driving with a pooch in your lap if you don’t take umbrage at my decision not to clean up after your dog in my own home.

When you go to other peoples’ houses, please be prepared to keep your dog in your car during your visit or to tie it up outside. 

Yes, I do like dogs

I like dogs; I always have.  But like everything else in life, I appreciate the good ones and regret the bad ones.  And I don’t really want to know every aspect of your dog’s importance in your life.  I understand what having a dog means—it’s a roughly similar experience for everyone.  Dogs are pleasant adjuncts to the lives of many people, but they are only a part of life.  If you find yourself droning on incessantly about your dog, please take up knitting, computer programming, gardening, French or scuba diving.  Life is short.



Kindle doesn't catch fire

by dhovel 7. November 2011 13:46

I just finished reading Steven Pinker’s wonderful new work, “The Better Angels of our Nature” on my wife’s (classic) Kindle.  I had been looking forward to having the work on a digital reader, but my expectations were dashed by the error-prone and primitive nature of today’s reader technology.  In brief, here are my primary complaints.

Pinker’s work is replete with graphs and charts.  On the Kindle, these render poorly, ranging from barely readable to laughable.  Since they are the core of the first half of the book, this is a significant limitation.

The rendering is poor.  The distinction between optional (line-break) and mandatory hyphens is entirely lost, leading to words being squeezed together and requiring decryption by the reader.  The footnote markers are sometimes rendered in a superscript (as would be expected), but often are just numbers stuffed into the current text line.  Pages have large gaps and pointless line-breaks which break the reading flow unnecessarily.

You cannot link from a footnote reference to the footnote and back.  I was shocked to find that the “Back” button doesn’t really go ‘back’ in a web browser sense.  If I jump to a stored bookmark to re-read a page, ‘back’ doesn’t go back!  Wow!

Almost the worst of all, this is a technical book, so its index and notes are very important.  Yet the notes do not allow you to jump to their original pages and the index is entirely useless!  In 2011 I would expect the index to have links to the actual references and allow me to iterate over the references to, say, Hobbes, looking for the quotation I wanted.  Instead the index is just a list of words.

All in all, I won’t be buying any technical works on the Kindle in the foreseeable future.  Unless the Kindle Fire significantly enhances rendering, page movement, footnotes, bookmarks and indexes, my advice is to stay with paper for books you respect.   It may be fine for the latest bodice-ripper or serial killer tome, but for serious (or even amateur) scholarship it just isn’t ready for prime-time.   How sad!


The Budget Debacle: Eliminating Taxation ‘Elites’

by dhovel 8. August 2011 18:20

The current disgusting impasse in the Beltway regarding the long-term resolution of the budget deficit is being treated as a political football, with nonsense and self-aggrandizement on both sides.  I feel compelled to add yet another personal viewpoint to the debate.

There are clear necessities to significantly change the current formulation of entitlement programs, reduce the size of the Department of Defense and discover new sources of primary governmental revenues.   Whether you think the ratio of cuts to revenue increases should be 4 to 1, 3 to 1 or 1 to 10, both sides of the ledger must urgently be addressed.  The current rate of income taxation is at historically low levels, far below those of the Eisenhower Administration.  The projected growth of the myriad entitlement programs is terrifying to anyone paying attention.

My survey of history shows that a clear sign of the decay of many systems can be traced to the exclusion of groups or elites from tax liability.  Currently, only 50% or so of American families pay any income tax.  The currently low capital gains rates and low income tax brackets allow the richest families in the U.S. to avoid significant tax payments.  Poorer families generally pay no net tax after deductions and credits are applied.  Many major corporations, despite record earnings, end up paying no corporate income tax at all, in spite of our seemingly high corporate tax rate vis-à-vis other industrialized nations.  The limitation or exclusion of these ‘elites’ from taxation must end.

One clear path to renormalizing the tax structure would be to create a base tax rate; that is, a minimum percentage that all taxable entities, personal and corporate, would pay regardless of other applicable deductions or loopholes.   A minimum rate of 5% would reincorporate fully half of American taxpayers into the revenue stream.  This would mean that every individual or corporation would always pay at least 5% of income as tax, regardless of any deductions, exemptions or credits.

It would be hard to argue with such a low percentage of taxation.  Such an approach would reincorporate every significant taxable entity into the federal tax revenue stream.   Along with significant changes to entitlement programs and reduction of America’s defense posture, the path to realistic deficit reduction would be open.


Microsoft as Patent Troll

by dhovel 28. April 2011 17:57

First it was Amazon with the infamous “one click” patent, and later the dubious ‘i4i’ lawsuit against Microsoft’s use of XML.  Along the way, numerous trivial and obvious user interface or data preparation methodologies have been patented and protected by an out-of-control patent system.

Microsoft was often the victim of such manifest injustices, but now it is perpetrating the same type of morally reprehensible litigation on Barnes & Noble’s “Nook” e-reader.  Trying to patent the tabbed dialog or cut-and-paste should be like trying to catch the wind.  These facilities are everywhere, and have existed in one form or another for forty years.

Personally, I can understand Oracle’s Larry Ellison playing fast and loose with Java licensing, or the owners of the Novell patents trying to milk (i.e. bilk) the system.  But Microsoft?  Granted, Microsoft has never directly espoused Google’s “do well while doing good” mantra, but at least its actions have rarely been based on such market-driven submarining as the Barnes & Noble suit.  I guess MS’s failure to achieve a position in the tablet market has led it to trying to hold successful tablet vendors for ransom.

Shame on you, Microsoft.  After releasing .Net for open specification and quasi-embracing open source, you had to pull this one.  Just develop a better product and let the market decide.  The unpleasant truth is that IP lawyers have enough work to do in our litigious culture without meritless lawsuits from multi-billion-dollar corporations.



Comcast DNS Abuse

by dhovel 4. November 2010 02:37

In the middle of debugging some code on my Mac, I discovered that Comcast has done it again.  Acted against the interests of its customers, that is.   My Mac code, intended for an iPhone app, was "pinging" various web sites; I discovered that non-existent web sites showed up as actually present.   At first I thought it was my app, so I kept looking for bugs.  After a while, I brought up the Mac OS X "Networking" utility and found that virtually all non-existent web addresses were being redirected to a single Comcast-controlled address.  Not only was this bogus node replying to pings, but it would act as a spurious web site by servicing HTTP requests over port 80!
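If you want to check your own connection, a small .NET console sketch like this shows the behavior.  The hostname is deliberately random gibberish, so any answer at all means your resolver is rewriting NXDOMAIN responses:

```csharp
using System;
using System.Net;
using System.Net.Sockets;

// Quick-and-dirty check for DNS NXDOMAIN rewriting: resolve a hostname
// that almost certainly does not exist.  A clean resolver throws a
// SocketException; a "helpful" ISP resolver hands back its ad server.
class DnsHijackCheck
{
    static void Main()
    {
        string bogus = "no-such-host-" + Guid.NewGuid().ToString("N") + "";
        try
        {
            IPAddress[] addrs = Dns.GetHostAddresses(bogus);
            Console.WriteLine("Suspicious: {0} resolved to {1}",
                              bogus, addrs[0]);   // likely a redirect server
        }
        catch (SocketException)
        {
            Console.WriteLine("Good: the resolver returned NXDOMAIN as it should.");
        }
    }
}
```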

In other words, Comcast has changed the way the internet behaves in order to push advertising.  Of course, in their propaganda they claim that this is some sort of "service" to users to give them alternatives, but in reality it's just another intrusive and error-provoking means of earning revenue.

I later found that there is an option in my Comcast account to disable this unwanted "helper" service.  I've turned it off, of course, but as yet I see no change in behavior.  Maybe their expensive databases will take a week or two to propagate an update from my account to their overly clever DNS servers.   Here is the explanatory link at Comcast.

It's sort of like going into a restaurant and ordering sushi, only to find canned tuna on the plate and a note saying that you were given lower-quality food because it may be lower in mercury.  The fundamental behavior of the internet is important and should be protected from the vagaries of oppressive marketing and the never-ending quest for yet another dollar.

Maybe I'm just too old for this crap.

Later note: I woke up this morning to find that their "opt out" feature actually works.  At least they got that right.  I'm dubious about all this, but at least I don't have to use it.

Update (10/2011):  I just found that the "DNS Helper" service had been turned back on; I have no idea why.  I went back to Comcast to change it again and discovered that it's very important to log in with your full Comcast email account name; otherwise, the "DNS Helper" option won't appear.  It seems they want to make this very difficult.



Windows Phone after a week or so

by dhovel 28. October 2010 01:10

Well, my pseudo-dummy Windows Phone app is almost done.  As a proof of concept, it falls quite short since sockets aren't supported in WP7.  However, I thought I'd blog again to review the good, the bad and the ugly.

First, the good.  Many things do work.  Since WP7 and Silverlight 4 are built on XAML, data binding works, so reflection, etc., must also work.  Most of the other usual suspects work, such as regular expressions.  I found (again) that the new XDocument classes are a treat to code for, and serialization works very well.  My whole app's state and all current data are just swooshed out to Isolated Storage and back in again via serialization.  I think it's almost a necessity to have Expressions 4 along with Visual Studio, and bopping back and forth between them is old hat for me by now.  C# is still the best language (for its class) out there, and Linq is supported, which surprised me.

Now the bad.  Again (from previous posts), no DataTriggers, no sockets.  Also, the Emulator doesn't persist app storage across executions, making complex data management almost impossible (who wants to use the cloud just to test?).  XAML is still, IMHO, a very near disaster.  It's completely un-type-safe, which in 2010 is almost a joke.  In addition, the whole process of creating well-behaved property notifications is completely tedious.

Since I don't have DataTriggers, I had to work around such common issues as selecting a graphic for an item in a listbox based on conditions.  To do this without triggers, I created a pseudo-property called StateImage that my XAML can bind to.  This means two things.  First, I have to create a bunch of these non-property properties to help the UI work like it should.  Second, I have to manually propagate notifications to cause XAML to rebind.  It's about time that XAML and C# supported declarative bindable properties so that OnPropertyChanged, etc., would happen without tedious coding.  Maybe I'm missing something.

In the same vein, XAML really needs to support more than just public properties.  Why can't I pass parameters if they're statically defined, such as integers?  Have you actually looked at the stack trace in a data binding call?  It's horrendous!  If the system were truly type-safe, these calls could avoid the gnarly depths of recursive reflection.  Then we'd have the same situation we have in C# 4.0: if you know about the class at compile time, it's type-safe and renders compile-time errors; if you don't, it's dynamic and renders run-time errors.  XAML these days is entirely dynamic.
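Here's roughly what the StateImage workaround looks like (a sketch; the class and property names are illustrative, not lifted from my app):

```csharp
using System.ComponentModel;

// A computed "pseudo-property" the XAML binds an Image.Source to, plus
// the manual change notifications needed to make the UI re-read it.
public class SiteItem : INotifyPropertyChanged
{
    public event PropertyChangedEventHandler PropertyChanged;

    private bool isOnline;
    public bool IsOnline
    {
        get { return isOnline; }
        set
        {
            if (isOnline == value) return;
            isOnline = value;
            OnPropertyChanged("IsOnline");
            OnPropertyChanged("StateImage");  // manually propagate to the pseudo-property
        }
    }

    // Not real state at all: just a binding target standing in for a DataTrigger.
    public string StateImage
    {
        get { return isOnline ? "Images/up.png" : "Images/down.png"; }
    }

    private void OnPropertyChanged(string name)
    {
        var h = PropertyChanged;
        if (h != null) h(this, new PropertyChangedEventArgs(name));
    }
}
```

Multiply that by every conditional visual in the app and you see why I call it tedious.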

Finally, the ugly.  There's no popup window!  This is nothing short of amazing, but luckily I found a great blog post by Shawn Wildermuth that showed me how to drag in code from Silverlight 3.0, sprinkle some of his sugar on it and produce a nice-looking (and zooming) popup window.  I guess the Metro gurus really don't like informative UIs, or they believe we've entered a golden era of failure-free networking.  

More ugly: there's no ComboBox!  Another delightful broomstick up the **** from the Metro guys.  Again, I found a blog post by Alex Yakhnin linking to a download of his PickerBox.  While it's still a little error-prone, it generally does exactly what it should.  I really have to wonder when WP design people decide that a UI element that is absolutely ubiquitous must simply be eliminated.  This is no way to win hearts and minds. 

Still more ugly: the ApplicationBar, which is inserted into every WP page's XAML automatically, is really not ready for prime time.  First (of course) the UI gods have decreed that there shall be no more than four buttons.  OK, I guess that's their job.  However, the buttons and menu items aren't FrameworkElements, which means you can't easily work with them.  The namespaces appear to be a little hokey, so you have to qualify things too often.  A word to the wise: you'd better use the blessed Microsoft ApplicationBar icons or strange things can happen.  It's also funny that buttons can appear correctly in Expressions but come up as big X's in the Emulator.

And more.  Page navigation is based on a web mode, where pages are represented by URIs.  That's OK, but when I want to pass parameters I have to resort to global variables (I just say no) or pass query terms.  There's no query term parser, so I have to code all that dreck by hand (thanks to regular expressions).  If they really insist on a URI model, why can't there be a type-safe HTTP POST capability that allows one page to parameterize another in a flexible way using serialization?

All in all, there's still the aroma of a rush to market by a team of people too convinced of their own wisdom.  I'm not going to sign up as a developer or get a phone until sockets work.  I hope.


Windows Phone, again-- less than it appears

by dhovel 23. October 2010 01:56

Well, I've spent quite a few hours with the Windows Phone SDK, Expressions Ultimate and Visual Studio 2010 over the last week.  The bottom line is that in spite of some very nice elements, the available assemblies (libraries) for Windows Phone display a distinct rush to market.

My principal complaint concerns the egregious limitations on XAML in the WP7 SDK.  I wrote a fairly interesting media player application in WPF and came to respect and appreciate the data-centric WPF model.  After a few false starts I had an app that responded automatically to changes in the core datasets of the program.  The key to this was data triggering of XAML styles.  But guess what-- DataTriggers are not available on WP7!  Microsoft continues to ballyhoo "model-view-controller" (MVC) methodology and praise data models, but without DataTriggers you cannot write a truly MVC application for WP7.

For example, to load a ListBox in "real" WPF you can specify the styles of the items, their containerization and visibility entirely from the data, using Binding to apply the content and DataTriggers to control both styling and structure through templates.  This is exactly what is needed in many cases.  In fact, it was exactly what I wanted for the WP7 app I want to write.  Instead, I'm left with the need to write oodles of plain old C# code to create listbox items, nest them and add them to the appropriate list.  This is very 1990's-style graphics development.
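To be fair, WP7 does offer one partial substitute: an IValueConverter can let the data drive individual properties such as Visibility, though it gives nothing like the structural control of a DataTrigger.  A minimal sketch (the class name is mine):

```csharp
using System;
using System.Globalization;
using System.Windows;
using System.Windows.Data;

// The usual WP7-era stand-in for a DataTrigger: route a bound boolean
// through a converter instead of triggering a style change.
public class BoolToVisibilityConverter : IValueConverter
{
    public object Convert(object value, Type targetType,
                          object parameter, CultureInfo culture)
    {
        return (value is bool && (bool)value)
            ? Visibility.Visible : Visibility.Collapsed;
    }

    public object ConvertBack(object value, Type targetType,
                              object parameter, CultureInfo culture)
    {
        throw new NotImplementedException();   // one-way binding only
    }
}
```

After declaring it as a resource, the XAML binds something like Visibility="{Binding IsImportant, Converter={StaticResource BoolToVis}}".  One property at a time: it works, but it's no replacement for restyling a whole template from the data.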

I have other beefs.  Why, pray tell, can you not persist IsolatedStorage elements across debugging sessions?  In my app I want the user to configure a list of sites.  I'll have to do that manually every time I restart the emulator!  This is very serious and seriously stupid.

As mentioned in a previous blog post, the WP7 SDK is crippled with regard to sockets.  I had to write a proxy web site to perform standard operations (like ICMP ping) that WP7 cannot do.  As I wrote before, I'm very suspicious of the motivations for this.

Given the crippled nature of this release, I'm seriously debating waiting for a future release to get into WP7 development.  Microsoft has had years (and years) to get this right.  It looks as though it's heading in the right direction, but it hasn't reached critical mass yet.  IMHO.