TechEd North America 2013 Summary

Source : Windows Phone Developer Blog

We were really excited to be at TechEd North America 2013 this week, where we participated in 15 deep, technical sessions all about Windows Phone for IT pros and enterprise developers.

Windows Phone 8 is gaining enormous traction in the enterprise and this was evident in the excitement and enthusiasm of the thousands of TechEd attendees who attended our sessions. Here’s a list of the sessions, with links to the Channel 9 page for each session, where you can view recordings of the presentations.

Monday, June 3

Tuesday, June 4

Wednesday, June 5

Thursday, June 6

For those of you in Europe, we’ll see you at TechEd Europe 2013 in Madrid at the end of June.

Find out more about why Windows Phone is the best choice for today’s businesses on our Windows Phone for business website, where we’ve collected critical info about Windows Phone collaboration, management, security, and enterprise app development features, along with links to deep technical resources and white papers.

Declaring your DataContext in markup

Source : Windows Phone Developer Blog

This blog post was authored by Steve White, a Senior Content Developer on the Windows Phone team.

– Adam

Setting the DataContext property to a view model instance in your page markup can increase the “designability” of your pages in Visual Studio and Blend for Visual Studio. In this post I explain why, and demonstrate how it works.

So that a page’s UI elements can bind to a view model’s properties and commands, the view model is assigned to the page’s DataContext property. A common dev practice is to set the DataContext value for a page imperatively, using code-behind. Here’s an example from a sample app I created, called Bookstore.

  public MainPage()
  {
      InitializeComponent();
      this.DataContext = new BookstoreViewModel();
  }

The result is a sparse design-time UI, both on MainPage.xaml and in the Create Data Binding dialog.


I have databound TextBlocks and a databound ListBox in my page. I also have an item template, which is what I’m editing in the preceding screenshot. All this runs and looks great in the Windows Phone Emulator and on Windows Phone. But that constructor isn’t executed at design time, so my page is more challenging to style and to lay out.

Let’s see what happens if I comment out the DataContext assignment in my constructor and set the DataContext in markup instead. To do this, first open your XAML page. Then, in the Document Outline window, click PhoneApplicationPage; in the Properties window, click DataContext, and then click New. Click your view model type from the Select Object dialog box, and then click OK.
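The markup those steps produce looks roughly like the following sketch (the prefix, assembly name, and BookstoreViewModel type follow the Bookstore sample above; the exact generated names may differ in your project):

```xml
<phone:PhoneApplicationPage
    x:Class="Bookstore.MainPage"
    xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
    xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
    xmlns:phone="clr-namespace:Microsoft.Phone.Controls;assembly=Microsoft.Phone"
    xmlns:local="clr-namespace:Bookstore">

    <!-- The view model instance is created by the XAML parser,
         so it exists at design time as well as at run time. -->
    <phone:PhoneApplicationPage.DataContext>
        <local:BookstoreViewModel/>
    </phone:PhoneApplicationPage.DataContext>

    <!-- Page content binds against the view model's properties here. -->
</phone:PhoneApplicationPage>
```

Because the instance is declared in markup rather than in the constructor, the designer can construct it and populate the binding pickers.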


Here’s what my design surface in Visual Studio looks like now. Notice that the Path picker in the Create Data Binding dialog is now populated, based on the DataContext type and the properties that you can bind to.


The Create Data Binding dialog only needs a type to work from, but the bindings need the properties to be initialized with values. In my Bookstore app, I don’t want to reach out to my cloud service at design time so my initialization code checks to see whether the app is running in a design tool such as Visual Studio or Blend.

  if (System.ComponentModel.DesignerProperties.IsInDesignTool)
  {
      // Load design-time books.
  }
  else
  {
      // Load books from a cloud service.
  }

You could use a view model locator if you need to pass parameters to your initialization code. A view model locator is a class that you can put into app resources. It has a property that exposes the view model, and your page’s DataContext binds to that property. Another pattern that the locator or the view model can use is dependency injection, which can construct a design-time or a run-time data provider (each of which implements a common interface), as applicable.
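A minimal locator might look like the following sketch (the ViewModelLocator class and Main property names are illustrative, not part of the sample):

```csharp
// Declared once in App.xaml resources:
//   <local:ViewModelLocator x:Key="Locator"/>
// Pages then bind:
//   DataContext="{Binding Main, Source={StaticResource Locator}}"
public class ViewModelLocator
{
    private BookstoreViewModel main;

    public BookstoreViewModel Main
    {
        get
        {
            if (this.main == null)
            {
                // Construct the view model lazily; pass design-time or
                // run-time parameters here as your initialization requires.
                this.main = new BookstoreViewModel();
            }
            return this.main;
        }
    }
}
```

Because the locator lives in app resources, the designer instantiates it too, so design-time data still flows to the page.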

If for some reason none of these options work for you, you could try implementing a design-time data context. That’s a bigger topic than we can cover here, but one way of doing it is to use the Sample Data from Class feature in Blend, which is demonstrated a little before the 12-minute mark in the video in the Inside Windows Phone – Data binding blog post.

I’ll leave you with one last tip. Did you notice that Visual Studio generated the XML namespace prefix local for the view model instance? If local is already being used elsewhere, Visual Studio generates a prefix based on the last segment of the CLR namespace. What I like to do is to use the exact CLR namespace, but replace periods with underscores. This way I never get collisions, even if I copy-and-paste code across pages.
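For example, for a view model in a CLR namespace Bookstore.ViewModels (an illustrative name), the convention looks like this:

```xml
<phone:PhoneApplicationPage
    xmlns:Bookstore_ViewModels="clr-namespace:Bookstore.ViewModels">
    <phone:PhoneApplicationPage.DataContext>
        <Bookstore_ViewModels:BookstoreViewModel/>
    </phone:PhoneApplicationPage.DataContext>
</phone:PhoneApplicationPage>
```

Because the prefix is derived deterministically from the full namespace, the same markup pastes cleanly into any page.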

A code sample for developing an enterprise hub is available on MSDN

Source : Microsoft France’s official French Windows Phone blog


On May 29 we published on MSDN a code sample for an enterprise hub. More complete than the very minimalist hub I proposed here earlier, this C# project is ready to build with Visual Studio Express. You will still need to:

  • modify the Applications.xml file to list the enterprise applications you want to make available (the file is provided as a syntax example, with links to applications that don’t exist),
  • use a certificate in all of the applications and in your enterprise hub so that the applications “recognize” each other and you can deploy them to phones.

All of these steps are described on the site.

Happy coding!

A Leap-compatible site from Nokia

Source : Monleap

Nokia has put a site online that lets you use the Leap to fly a paper plane over well-known cities, including American cities and cities such as Prague. Anyone with a beta version of the Leap Motion can use the site without installing any add-on. We tried it: the controls are different from Google Earth’s, but it’s quite fun! Here’s the introduction video:

The site in question

The New Generation Kinect for Windows Sensor is Coming Next Year

Source : Kinect for Windows Product Blog

The all-new active-infrared capabilities allow the new sensor to work in nearly any lighting condition, making it possible for developers to build apps with enhanced recognition of facial features, hand position, and more.

By now, most of you likely have heard about the new Kinect sensor that Microsoft will deliver as part of Xbox One later this year.

Today, I am pleased to announce that Microsoft will also deliver a new generation Kinect for Windows sensor next year. We’re continuing our commitment to equipping businesses and organizations with the latest natural technology from Microsoft so that they, in turn, can develop and deploy innovative touch-free applications for their businesses and customers. A new Kinect for Windows sensor and software development kit (SDK) are core to that commitment.

Both the new Kinect sensor and the new Kinect for Windows sensor are being built on a shared set of technologies. Just as the new Kinect sensor will bring opportunities for revolutionizing gaming and entertainment, the new Kinect for Windows sensor will revolutionize computing experiences. The precision and intuitive responsiveness that the new platform provides will accelerate the development of voice and gesture experiences on computers.

Some of the key capabilities of the new Kinect sensor include:

  • Higher fidelity
    The new sensor includes a high-definition (HD) color camera as well as a new noise-isolating multi-microphone array that filters ambient sounds to recognize natural speaking voices even in crowded rooms. Also included is Microsoft’s proprietary Time-of-Flight technology, which measures the time it takes individual photons to rebound off an object or person to create unprecedented accuracy and precision. All of this means that the new sensor recognizes precise motions and details, such as slight wrist rotation, body position, and even the wrinkles in your clothes. The Kinect for Windows community will benefit from the sensor’s enhanced fidelity, which will allow developers to create highly accurate solutions that see a person’s form better than ever, track objects and environments with greater detail, and understand voice commands in noisier settings than before.

The enhanced fidelity and depth perception of the new Kinect sensor will allow developers to create apps that see a person's form better, track objects with greater detail, and understand voice commands in noisier settings.

  • Expanded field of view
    The expanded field of view accommodates a multitude of differently sized rooms, minimizing the need to modify existing room configurations and opening up new solution-development opportunities. The combination of the new sensor’s higher fidelity plus expanded field of view will give businesses the tools they need to create truly untethered, natural computing experiences such as clicker-free presentation scenarios, more dynamic simulation and training solutions, up-close interactions, more fluid gesture recognition for quick interactions on the go, and much more.
  • Improved skeletal tracking
    The new sensor tracks more points on the human body than previously, including the tip of the hand and thumb, and tracks six skeletons at once. This not only yields more accurate skeletal tracking, it opens up a range of new scenarios, including improved “avateering,” the ability to develop enhanced rehabilitation and physical fitness solutions, and the possibility to create new experiences in public spaces—such as retail—where multiple users can participate simultaneously.

The new sensor tracks more points on the human body than previously and tracks six skeletons at once, opening a range of new scenarios, from improved "avateering" to experiences in which multiple users can participate simultaneously.

  • New active infrared (IR)
    The all-new active-IR capabilities allow the new sensor to work in nearly any lighting condition and, in essence, give businesses access to a new fourth sensor: audio, depth, color…and now active IR. This will offer developers better built-in recognition capabilities in different real-world settings—independent of the lighting conditions—including the sensor’s ability to recognize facial features, hand position, and more. 

I’m sure many of you want to know more. Stay tuned; at BUILD 2013 in June, we’ll share details about how developers and designers can begin to prepare to adopt these new technologies so that their apps and experiences are ready for general availability next year.

A new Kinect for Windows era is coming: an era of unprecedented responsiveness and precision.

Bob Heddle
Director, Kinect for Windows

Key links

Photos in this blog by STEPHEN BRASHEAR/Invision for Microsoft/AP Images


Kinect 2: a competitor for the Leap?

Source : Monleap

Two days ago, Microsoft announced its new console, the Xbox One, and took the opportunity to unveil the next-generation Kinect. For the lazy, a video presenting the new features is at the bottom of the article :)

What’s new

There are several improvements in this new model, which looks far more imposing than the old one (in size as well as in performance). The sensor can now detect the fingers of the hand, as well as facial expressions. It can even detect players’ heart rates from their faces (we’ll wait and see). You can now play six to a room, and the Kinect 2 can even detect movement in low light.

Objects around the players are detected much better, and latency is almost nonexistent. The sensor can now detect the force of your movements, which could be particularly interesting for fighting games.

A real competitor for the Leap Motion?

Alongside this announcement, Scott Evans, the head of the Kinect program, announced that the sensor would be available for PC with a dedicated SDK. There’s no information on the price without the console, but given the price of the Kinect 1, the new version will be more expensive than the Leap. In terms of usage, the Leap doesn’t serve the same purpose as the Xbox sensor, because it is more precise. However, the Kinect is interesting for its ability to detect movement at long range (starting a playlist from across the room, for example), as well as for its voice sensor. Another sticking point for Kinect adoption on PCs is its large size, whereas the Leap Motion is tiny.

Here’s the video, to give you an idea of the new possibilities:

SYNERGIZ delivers new interactive solutions at the CFE-CGC congress

Source : The SYNERGIZ blog

On April 17, 18, and 19, the CFE-CGC union held its 35th annual congress, with innovation as its underlying theme. For the occasion, the CFE-CGC engaged SYNERGIZ to design and develop interactive entertainment and communication solutions aimed at congress attendees and members, as well as at the journalists covering the event.

The “Roue de l’innovation” (Innovation Wheel)

A brand-new SYNERGIZ creation, the “Roue de l’innovation” was designed to let visitors enjoy an experience that is both playful and innovative, via two B-KI kiosks placed at two locations in the forum. The goal is to offer attendees a digital attraction – a digital wheel of fortune – for winning prizes to be collected at the booths. The partners, in turn, are part of the attraction, which also drives traffic to their booths.

The “Roue de l’innovation” by Synergiz

The “Roue de l’innovation”: how it works

Developed with Kinect technology, the “Roue de l’innovation” is a gesture-driven application, with no contact with the screen. Once detected in the play zone marked on the floor, the user presses a virtual buzzer remotely, toward the screen, to set the wheel spinning. When the wheel stops on the name of one of the partners, the code displayed lets the lucky winner collect a prize at the indicated booth. Each user’s experience is immortalized by an automatic photo, taken at the very moment the user triggers the wheel.

Welcome screen of the “Roue de l’innovation”

The “Collect & Share” solution on SUR40

SYNERGIZ adapted its Collect & Share solution, offering two distinct usages aimed at the two types of visitors expected at the congress: attendees on one hand and the press on the other. Here, Collect & Share lets one or more simultaneous users work on the SUR40 multitouch table, equipped with PixelSense technology, to collect and exchange information published by the CFE-CGC. The solution also fosters “social connection” through the sharing of this information between users and the exchange of contacts (collecting virtual business cards). Finally, directly from the screen, each user can send the information they collect straight to their own email inbox.

Application "Collect & Share" pour la CFE-CGC

Usage by congress attendees

In the first case, the touch screen located in the partner forum is built into the M-DY table, making multi-user operation easier. To sign in, attendees scan their personal badge with the flashcode reader connected to the interactive table. The user’s personal data is then extracted, and a personal zone bearing their name appears on the table. They can then start collecting the documents of their choice.

"Collect & Share" avec lecteur de QRCode

Usage by journalists in the press room

In the second case, the table is made available to journalists in the press room. Journalists authenticate via the badge issued on arrival, which carries a specific code (“tag”) associated with them. Once the badge is placed on the screen’s surface, their personal zone appears. Directly on the table, they can then view, retrieve, share, and receive the documents made available by the CFE-CGC press office, and also exchange business cards with colleagues.

Press room – CFE-CGC congress

Testing your Windows Phone app – Part 2

Source : Windows Phone Developer Blog

This blog post was authored by Craig Horsfield, a Senior SDET on the Windows Phone Test and Operations team.

– Adam

Testing your app throughout the development process can help you create a really great Windows Phone app. Testing helps ensure that your app is effectively represented in the Windows Phone Store as an app that offers Windows Phone users a high level of performance and quality. A small investment in the key areas described in this post can help you bypass common errors early in the development process, and help you get positive results in the long term. This post is part 2 of a 3-part series that outlines key test areas that you should consider before submitting your app to the Store. See part 1 for additional details.

Push notification and Live Tiles

Live Tiles

Live Tiles are updated through push notification or through an app’s periodic background agent. When testing these areas, you should accelerate the update time so that you can test more rapidly. For more info, see Tiles for Windows Phone.

Test scenario


1. Verify that your Live Tile updates.

Verify that the Live Tile updates after it has been pinned to Start.

2. Verify that your Live Tile stops updating.

If this setting is disabled in the app, make sure the Live Tile stops updating.

3. Verify that the Live Tile updates via a periodic agent.

If the Live Tile is updated via a periodic agent, verify the update on all network types, and verify that the Live Tile updates when there is no network, for example, in Airplane mode.

4. Verify that the Live Tile is working and present after an app upgrade.


5. Verify that the Live Tile is working and present after an app upgrade and a subsequent device restart.

If you’re updating the app, make sure that you don’t change the TokenID in the WMAppManifest.xml file. Changing it results in your Live Tile being removed from Start when the device is restarted.

6. If using a background agent, verify that the agent doesn’t crash or terminate.

If the agent crashes or terminates, it is disabled and Tile updates will fail.
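As a test hook for the scenarios above, you can push a visible change to the app’s default tile locally with the ShellTile API; a minimal sketch (the title and counter values are placeholders):

```csharp
using System.Linq;
using Microsoft.Phone.Shell;

public static class TileTestHelper
{
    // Stamps a visible change onto the app's default tile so a tester can
    // confirm that tile updates are being applied. The first entry in
    // ActiveTiles is always the app's default tile.
    public static void StampTile(int counter)
    {
        ShellTile appTile = ShellTile.ActiveTiles.First();
        appTile.Update(new StandardTileData
        {
            Title = "Bookstore",   // placeholder title
            Count = counter        // visible badge count to confirm the update ran
        });
    }
}
```

Calling this from a periodic agent with an incrementing counter makes it easy to see at a glance whether updates have stopped.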


Push notifications

Apps normally use notifications in combination with background agents. Test these notifications to ensure that they work properly. For more info, see Notifications for Windows Phone.

Test scenario


1. Verify that notifications are received.


2. Verify what happens when you tap the notification.

Tapping the notification launches the user into the app in the correct state.

3. Verify that the app doesn’t overuse toast notifications.



Background agents

Background agents provide key capabilities for an app, but they also introduce some specific test considerations. Agents can be disabled and enabled in the phone’s settings, on the Settings > Application > Background Tasks screen, so the app needs to be aware of the state of its agent. The resources available to agents are also restricted. A key point to remember is that these restrictions are not enforced when running in the debugger, so it’s important to test your app outside of the debugger and track the resources that it is using. For more info, see Background agents for Windows Phone.
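Checking and recovering the agent’s state from the foreground app might look like this sketch (the task name is a placeholder):

```csharp
using System;
using Microsoft.Phone.Scheduler;

public static class AgentHelper
{
    private const string TaskName = "BookstorePeriodicTask"; // placeholder name

    // Re-registers the periodic agent; call from the foreground app at startup.
    // If the user disabled background tasks for this app in Settings, or the
    // device's agent limit has been reached, ScheduledActionService.Add throws
    // InvalidOperationException.
    public static bool TryScheduleAgent()
    {
        PeriodicTask task = ScheduledActionService.Find(TaskName) as PeriodicTask;
        if (task != null)
        {
            // Remove and re-add so the schedule is refreshed on every launch.
            ScheduledActionService.Remove(TaskName);
        }

        task = new PeriodicTask(TaskName) { Description = "Updates the Live Tile." };
        try
        {
            ScheduledActionService.Add(task);
            return true;
        }
        catch (InvalidOperationException)
        {
            // Agent disabled by the user, or the per-device agent limit was hit:
            // notify the user as described in the scenarios below.
            return false;
        }
    }
}
```

Returning false here gives the foreground app a hook to show the “agent is needed” notification that the test scenarios below call for.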

Test scenario


1. Verify initial app start.

The agent starts when it’s needed.

2. Verify that the app handles the state in which the agent has been disabled by the user in Settings > Application > Background Tasks.

Additionally, verify that the app performs as follows:

  • App notifies the user that the agent is not available and continues to work as expected.
  • App notifies the user that the agent is needed so that the user can re-enable it.

3. Agents are disabled in low-battery conditions – test that the app can handle these states.


4. Agents that crash or are terminated by the OS for exceeding resource limits are disabled after two successive crashes. Ensure your app handles this state.

In this state the foreground app has to reschedule the agent. This state can be queried from the agent API. For more info, see Background agent best practices for Windows Phone.

5. A resource-intensive agent runs only when the phone is on external power and Wi-Fi. Ensure the app handles this correctly.


6. Verify that the agent stays below the required CPU and memory caps.


This table lists the required memory caps, by agent type, on Windows Phone 8:

  11 MB
  20 MB
  10 MB
  VoIP agent: 60 MB

Media, audio, and video

Media, audio, and video features throughout your app should be tested. Consider the test scenarios below. For more info, see Media for Windows Phone.

Test scenario


1. Preserve audio state.

  • If your app plays an audio sound (clip) when it starts, your app should not pause the currently playing audio. The app should preserve, and not interfere with, the audio currently playing on the device.
  • If your app plays back audio content from a background agent or a foreground app, your app should pause any currently playing audio.

2. Verify the Universal Audio Control during audio playback.

  • Verify Play.
  • Verify Pause / Resume.
  • Verify Skip Next / Skip Previous.
  • Skip to last and back to the first track, etc.
  • Some apps may take time to process these calls. Verify that the UI is set to disabled while the app is processing these calls to prevent multiple invokes.
  • Verify volume controls.
  • Audio continues under lock.
  • Audio continues when app is on the back stack.
  • Verify that all codecs that the app offers can be played back.
  • Track info is displayed in the UI.

3. Verify audio playback via a background audio agent.

  • Test all of the above test cases.
  • Audio continues when the app is terminated but the background agent is allowed to continue.
  • Audio continues to play when device screen is locked.
  • The app’s audio agent remains below the 20-MB memory cap.

4. Media Source

  • Verify media from a network stream.
  • Verify media from the app’s ISO store.
  • Verify media playing from the media library on the phone. (This case requires the correct capability in the app manifest.)

5. Video playback

  • Verify all states:
    • Play from start
    • Pause, resume
    • Skip forward and back
    • Change states rapidly

6. Video playback in FAS scenarios

  • On the navigate-away event, the app needs to record the current media stream position.
  • Play a media stream, return to the phone’s Start screen, and then switch back to the app. Verify that the stream position is preserved and playback continues from the previous point.
  • Play a media stream and force the app to tombstone via Visual Studio. In this case the app should continue from the previous point, but it will have to reload the stream and seek to the position saved in the navigate-away and deactivated events.

7. Bandwidth

  • Verify media stream playback on different networks and bandwidths. The app should adjust playback quality and codec as needed.
  • Verify the dropped-network scenario: the app should handle it and inform the user correctly.

The Visual Studio Windows Phone emulator can manipulate the network quality and type to aid in this testing.

8. Media hub integration

  • Verify that the app is visible in the Media hub; the manifest must have HubType=1.
  • The app has to use the MediaHistory and MediaHistoryItem classes in this scenario.
  • Verify post-marketplace ingestion, where the correct capabilities are set in the app manifest to enable the app in the hub. (Use the Beta submission process to test this.)
  • The app must update the ‘now playing’ tile and the ‘history’ and ‘new’ lists.
  • Verify that the app plays the correct stream when launched from the history or new list.

9. FM Radio

  • The FM radio API is not compatible with all hardware and all versions of Windows Phone. Test the app on the correct platform.
  • Verify that the app sets the correct region based on the phone’s location; this enables the correct frequency stepping. If the region is incorrect, the radio will not tune well or find stations.
  • Set the correct power modes. For example, verify that when the user is not playing media from the FM radio, the app sets RadioPowerMode to off.


Location

With Windows Phone 8, you can create apps that use info about the phone’s physical location. Scenarios for location-aware apps include checking the user into a web service using the user’s instantaneous location, and tracking the user’s location as it changes over a period of time. The location data the phone provides comes from multiple sources, including GPS, Wi-Fi, and cellular. The Visual Studio Windows Phone emulator can be used to simulate location changes, either by moving the location manually or by simulating a sequence of location changes. For more info, see Location for Windows Phone 8.

Test scenario


1. Prevent PositionChanged events from firing too often and placing CPU processing load on the app.

Set the MovementThreshold property to a value appropriate for the app’s needs, and make sure that events fire only outside of that threshold.

2. Handle the unknown location state.

Ensure the app can handle NA values. Test the app when it has no location state.

3. Test a large change in location data so that any internal calculations in the app do not fail in these cases.

For example, position changes greater than 1 degree. This can happen if the phone has had no valid location data for some period of time.

4. Test app in all hemispheres: North/South and West/East.

Ensure that your calculations on negative degree values are correct.

5. Test 0-degree and 180-degree location for longitude and latitude.


Design considerations when using location:

Test scenario


1. Use a lower level of accuracy to save on battery power if applicable.


2. Check the Position.Location.IsUnknown and GeoPositionStatus properties to ensure that the location is valid.
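Both considerations can be sketched with the GeoCoordinateWatcher API (the 20-meter threshold is an illustrative value):

```csharp
using System.Device.Location;

public class LocationTracker
{
    private readonly GeoCoordinateWatcher watcher;

    public LocationTracker()
    {
        // Default accuracy saves battery; request High only when the
        // scenario genuinely needs it.
        this.watcher = new GeoCoordinateWatcher(GeoPositionAccuracy.Default)
        {
            // Illustrative value: suppress PositionChanged for moves under 20 m.
            MovementThreshold = 20
        };

        this.watcher.PositionChanged += (s, e) =>
        {
            // Guard against the unknown-location state before any calculation.
            if (e.Position.Location.IsUnknown)
            {
                return;
            }
            // Use e.Position.Location.Latitude / Longitude here.
        };

        this.watcher.StatusChanged += (s, e) =>
        {
            // e.Status is GeoPositionStatus.Disabled, NoData, Initializing, or Ready;
            // treat anything other than Ready as "no valid location yet".
        };
    }

    public void Start() { this.watcher.Start(); }
    public void Stop()  { this.watcher.Stop(); }
}
```

The emulator’s location simulator exercises both event handlers, including the unknown-location path, without leaving your desk.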


Resource usage and performance

Windows Phone apps need to be designed to efficiently use and preserve the limited resources of the phone platform. You want to design your apps to use the least possible CPU cycles, to access networks efficiently and purposefully, and to make the best use of visual components—graphics, bright colors, and themes use more power than a simpler UI.

Test areas to consider:

Test scenario


1. Check for a nonresponsive or jerky UI.

This could be caused by long-running activity on the UI thread.

2. Check for memory leaks – repeat scenarios multiple times to detect memory leaks during specific sequences.

Repeated page navigation could increase app memory.

3. Check for rapid battery drain.

This can be caused by using the app for long periods of time.

Tips and tools:

  • The Visual Studio app profiler, available on the Debug menu, is a key tool for looking into app memory and CPU usage.
  • The EnableFrameRateCounter and EnableRedrawRegions settings can be useful when testing your app.
  • Use APIs in the DeviceStatus class to track memory usage in the app, especially ApplicationCurrentMemoryUsage and ApplicationPeakMemoryUsage.
  • See App performance considerations for Windows Phone for an overview of app performance for Windows Phone.
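A lightweight way to watch for creeping memory use during repeated test passes is to log the DeviceStatus counters at key checkpoints; a sketch (the checkpoint label is a placeholder):

```csharp
using System.Diagnostics;
using Microsoft.Phone.Info;

public static class MemoryProbe
{
    // Call at the end of each repeated scenario (e.g. after every page
    // navigation loop) and watch for a steadily rising current value,
    // which indicates a leak in that sequence.
    public static void Log(string checkpoint)
    {
        long current = DeviceStatus.ApplicationCurrentMemoryUsage;
        long peak = DeviceStatus.ApplicationPeakMemoryUsage;
        Debug.WriteLine("MemoryProbe [{0}]: current={1:N0} bytes, peak={2:N0} bytes",
            checkpoint, current, peak);
    }
}
```

Because the probe uses Debug.WriteLine, the output appears in the Visual Studio Output window and compiles away in release builds.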

In part 3 of this series, we’ll discuss additional areas and testing approaches to consider, including network resources, device-specific tests for hardware variations, display resolution, app upgrade, common Store test cases, and real-world testing.

The Leap enters the medical world

Source : Monleap

Among the uses you imagined for the Leap Motion, some of you thought of possible applications in the medical field, notably for remote operations. We’re not there yet, but the company Scopis Medical has presented its Leap Motion-compatible software. Among other things, it can be used to manipulate X-ray images. The Leap could certainly solve some hygiene issues in operating rooms: the surgeon can control a computer without touching it, and therefore stay sterile.

We’ll let you discover the software’s capabilities in the video.

Testing your Windows Phone app – Part 1

Source : Windows Phone Developer Blog

This blog post was authored by Craig Horsfield, a Senior SDET on the Windows Phone Test and Operations team.

– Adam

Testing your app throughout the development process helps to ensure that your app has a high level of performance and quality. Testing also helps to make sure your app is effectively represented in the Windows Phone Store. A small investment in the key areas described in this post can give your app positive results in your market, and help you bypass common errors early in the development process. This post outlines testing approaches for the key test areas that you should consider.

Beta testing

After you have thoroughly tested your app using the emulator and on a Windows Phone device, it’s important to also test it as a beta app. Submit your app for beta distribution, where you can continue to test it based on the beta and code-flow experience. Additionally, you can invite beta testers to help review your app. For more info, see Beta testing your app and in-app products.

Installation and launch

Testing your app’s install and launch process is a vital area that is often overlooked. By following these steps, you can minimize installation and launch issues that are not handled by your app. There are a few scenarios to consider when testing the initial launch experience and the resume path scenarios. One of these is the new fast app resume scenario that a Windows Phone 8 app can opt into.

The following tests can help verify your app’s install, launch, and resume functions.

Test scenario


1. Verify the app on the first launch.

Note any End User License Agreement (EULA) and any other one-time notifications that appear in the launch UI.

2. Reinstall the app.

Verify that all one-time notifications appear again on a subsequent install and launch.

3. Start the app, and then return to the Windows Phone Start screen.

Launch the app again from a Live Tile, or from the phone’s App list.

  • Verify the default scenario, in which the old app instance is terminated and a new instance of the app starts.

  • For Windows Phone 8 apps, if the app uses fast app resume (ActivationPolicy="Resume" in the manifest), the app resumes instead of starting a new instance. Verify that the app returns to the correct page in the app.

Fast Application Switching (FAS) and tombstoning

When a Windows Phone app is moved to the back stack based on the app’s navigation history, the app is suspended. When the app resumes, it returns to the foreground. Apps on the back stack are tombstoned when there is resource pressure on the device. The exception occurs when running location-tracking apps in the background for Windows Phone 8. Windows Phone 8 location-tracking apps continue to run in this scenario. Your app needs to handle both of these cases, and you should test this function in your app. FAS occurs when the user leaves the app to go to Start, Search, or some other path based on the launchers and choosers you’ve used in your app to support navigation. For more info, see Multitasking for Windows Phone and App activation and deactivation for Windows Phone.

The following tests can help verify your app’s FAS and tombstoning functions.

Test scenario


1. Test the phone’s Start button and Back button.

Launch the app, and then press the Start button to verify that the Windows Phone start page appears. Press the Back button to resume the app. Verify that the app resumes in the required time, and in the state it was in before you navigated away from the app.

2. Test the Camera button.

Launch the app, and then press the Camera button to verify that the camera starts. Press the Back button to resume the app. Verify that the app resumes in the required time (10 seconds) and that it is in the state it was in before you navigated away from the app.

3. Test locking the phone using the Power button.

Launch the app, and then press the Power button to lock the phone. Unlock the phone, and then verify that the app resumes in the required time, and that the app is in the state it was in before you locked the phone.

4. Perform the same test cases for FAS and tombstone state.

To force a tombstone path, in Visual Studio, in the project Properties window, click the Debug tab, and then select Tombstone upon deactivation while debugging.

5. Test deactivating and closing your app.

Make sure the app doesn’t take too long to process the Application_Deactivated and Application_Closing events, so that the app can save its state correctly. Avoid doing a lot of work in the Application_Closing code path, because processing time and access to the UI thread are limited.
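As a sketch of keeping these handlers lightweight, save only small bits of transient state to the PhoneApplicationService state dictionary; the `currentQuery` field and the "LastQuery" key are hypothetical names for illustration:

```csharp
// App.xaml.cs -- keep these handlers fast; processing time is limited.
using Microsoft.Phone.Shell;

private string currentQuery;  // hypothetical transient UI state

private void Application_Deactivated(object sender, DeactivatedEventArgs e)
{
    // Save small, transient UI state here; persist anything larger to
    // isolated storage from a less time-constrained code path instead.
    PhoneApplicationService.Current.State["LastQuery"] = currentQuery;
}

private void Application_Activated(object sender, ActivatedEventArgs e)
{
    // IsApplicationInstancePreserved is false when the app was tombstoned
    // rather than merely dormant, so state must be restored.
    if (!e.IsApplicationInstancePreserved &&
        PhoneApplicationService.Current.State.ContainsKey("LastQuery"))
    {
        currentQuery = (string)PhoneApplicationService.Current.State["LastQuery"];
    }
}
```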

6. Test background execution of your app.

For continued background execution in location tracking apps, you’ll want to test the app’s RunningInBackground event handler. In addition to running the preceding tests, check for the following conditions:

· Verify that background execution continues when the app enters the back stack.

· Start the app, move it to the back stack, and then resume the app. Verify that the app was not suspended and that it continued to run.
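A location-tracking app handles the RunningInBackground event in App.xaml.cs (the event is registered in App.xaml on Windows Phone 8); a minimal sketch, with a hypothetical flag the UI code can check:

```csharp
// App.xaml.cs -- raised instead of Deactivated when a location-tracking
// app continues to run in the background on Windows Phone 8.
private void Application_RunningInBackground(object sender, RunningInBackgroundEventArgs e)
{
    // Reduce work while in the background: stop UI updates and keep
    // only the location tracking itself running.
    isRunningInBackground = true;  // hypothetical flag checked by UI code
}
```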

App navigation

You should navigate all paths in the app to test its navigation. You especially want to be sure that your app exits and handles navigation interruptions as intended.

Test scenario


1. Test the Back button.

Your app should exit when you press the Back button on the topmost page.

2. Test navigation interruptions.

Your app should not initiate a new navigation while a navigation is already in progress. Test rapid navigation in the app, and navigating in the app while pages are loading.
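One simple guard against overlapping navigations is a flag that’s cleared when the new page becomes active; a sketch (the page URI and method name are hypothetical):

```csharp
private bool isNavigating;

private void OpenDetails()
{
    if (isNavigating) return;  // ignore taps while a navigation is in flight
    isNavigating = true;
    NavigationService.Navigate(
        new System.Uri("/DetailsPage.xaml", System.UriKind.Relative));
}

protected override void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e)
{
    base.OnNavigatedTo(e);
    isNavigating = false;  // re-enable navigation once the page is active
}
```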

Don’t throw an unhandled exception to exit the app from a page that isn’t at the top of the page stack; an unhandled exception triggers an app crash and a Watson error report, and you might see the crashes indicated in the app reports available in the Store. Instead, skip over the page by removing it from the back stack, so the page isn’t displayed on the exit path, or call the Application.Terminate API available in Windows Phone 8 apps.
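One way to keep a transient page, such as a one-time login page, off the exit path is to remove it from the back stack after navigating past it; a sketch:

```csharp
// In the page you navigate to *after* the transient page, drop that
// page from the back stack so the Back button skips it on the way out.
protected override void OnNavigatedTo(System.Windows.Navigation.NavigationEventArgs e)
{
    base.OnNavigatedTo(e);

    if (NavigationService.CanGoBack)
    {
        NavigationService.RemoveBackEntry();  // removes the most recent entry
    }
}
```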

Launchers and choosers

Launchers in your Windows Phone app give users the ability to perform common tasks, such as launching the Bing Maps app, Media Player, or the Store (Marketplace). Launchers don’t return data or status to your app. With a chooser, on the other hand, users can return data to your app; for instance, a user can pick a contact in the Contacts app and return its info. For each launcher and chooser you use, test your app to be sure it works correctly and doesn’t trigger related errors.

Test scenario


1. Test navigating to and from the launcher or chooser.

In the app, navigate to the launcher or chooser, and then immediately go back to the previous page or screen. The app should return to the page and state from which the navigation was initiated.

2. Test the complete scenario when using the launcher or chooser.

In the app, navigate to the launcher or chooser. Complete the scenario, for example, select a photo or send an email. When the action is complete, the app should return to the preceding app state.
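A chooser round trip can be exercised with the PhotoChooserTask, for example; a sketch (chooser tasks should be page-level fields with their Completed handlers wired in the constructor so they survive tombstoning):

```csharp
using Microsoft.Phone.Tasks;

private readonly PhotoChooserTask photoChooser = new PhotoChooserTask();

public MainPage()
{
    InitializeComponent();

    // The Completed handler fires when the app resumes after the chooser.
    photoChooser.Completed += (s, e) =>
    {
        if (e.TaskResult == TaskResult.OK)
        {
            // e.ChosenPhoto is a stream containing the selected picture.
        }
    };
}

private void ChoosePhoto_Click(object sender, System.Windows.RoutedEventArgs e)
{
    photoChooser.Show();  // deactivates the app; test the resume path here
}
```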

3. Test FAS scenarios when navigating to a launcher or chooser.

Navigating to a launcher or chooser can trigger an FAS scenario. See the related section earlier in this post for additional test scenarios related to FAS.

UI and layout

You need to test to make sure all elements of your UI and their layout work as intended, and meet app submission requirements, such as supporting Light Theme and Dark Theme. Test each control you use in your app, and the overall appearance of your app on the phone.

The following tests will help verify your app’s UI and layout.

Test scenario


1. Test your app in the Light Theme.

Ensure all text is readable and all UI is visible.

2. Test your app in the Dark Theme.

Ensure all text is readable and all UI is visible.
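Where UI can’t rely on the built-in theme resources, the current theme can be detected at run time; a sketch:

```csharp
using System.Windows;

// PhoneLightThemeVisibility is Visible when the Light theme is active.
bool isLightTheme =
    (Visibility)Application.Current.Resources["PhoneLightThemeVisibility"]
        == Visibility.Visible;

// Prefer theme resources (PhoneForegroundBrush, PhoneBackgroundBrush)
// so text stays readable in both themes without special-casing.
```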

3. Test the UI for text clipping.

You may need to set the text to wrap or make other layout changes.
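Setting text to wrap is often enough; a sketch (the binding path is hypothetical):

```xml
<!-- Wrap long strings instead of letting them clip at the edge. -->
<TextBlock Text="{Binding Title}"
           TextWrapping="Wrap"
           Style="{StaticResource PhoneTextNormalStyle}" />
```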

4. Test your splash screen display.

If you use a splash screen in your app, verify that it’s visible when the app starts, and that it displays for the expected time.

5. Test each animation.

Ensure all animations are smooth. If they appear slow or stop unexpectedly, see if the app is doing excessive work on a background task or thread.

6. Test each screen orientation.

Verify that the layout looks as you intended in both portrait and landscape orientations, and that each page displays correctly when the orientation changes.
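Orientation support is declared on the page’s root element; a sketch (the handler name is hypothetical):

```xml
<!-- MainPage.xaml: declare which orientations the page supports;
     handle OrientationChanged to adjust the layout if needed. -->
<phone:PhoneApplicationPage
    SupportedOrientations="PortraitOrLandscape"
    Orientation="Portrait"
    OrientationChanged="OnOrientationChanged">
    <!-- page content -->
</phone:PhoneApplicationPage>
```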

In part 2 of this series, we’ll discuss additional areas and testing approaches to consider related to push notifications; Live Tiles; background agents; media, audio, and video; geolocation; and resource usage and performance.