Sunday, 16 November 2008
NUnit was designed to be used with managed apps. So what?
All you need to do to test native code is create a C++/CLI project to host the test classes. To link the test project to the native C++ project, add a solution configuration and call it ReleaseUnitTests, for instance. In this configuration, the native C++ project builds as a static library instead of an executable; the C++/CLI test project links against that library and can call any public method of the native project.
Job done! Who needs cppunit?
- Install NUnit.
- In Visual Studio Pro, create a native C++ project and call it CatHouse
- Add a class called Cat with a public method void Feed()
- Add another project to the solution: choose Visual C++ > CLR > Class Library and call it CatHouseTests. This will create a mixed assembly containing both native and managed code.
- Add a C++ managed class called CatTest. Declare it public because we want it to be a managed type that can be seen by NUnit. Add a public method called TestFeed().
- Open CatHouseTests project properties, go to Common Properties > References and add the nunit.framework.dll assembly.
- In CatTest.h, add the line using namespace NUnit::Framework;.
- Now you can add the [TestFixture] and [Test] attributes to the class and test method declarations respectively.
- By default the CatHouse project has Debug and Release configurations. Add a new one called ReleaseUnitTests: in Build > Configuration Manager, in the Active Solution Configuration drop-down, select New. In the dialog, type ReleaseUnitTests and in Copy settings from choose Release. Click OK. Now all projects have a configuration called ReleaseUnitTests.
- Using Configuration Manager, ensure that the Debug and Release solution configurations build CatHouse only and not CatHouseTests. Ensure that ReleaseUnitTests builds both CatHouse and CatHouseTests, both with project configuration ReleaseUnitTests.
- Open the properties of project CatHouse and in Configuration Properties > General, select the ReleaseUnitTests configuration. Change Configuration Type to Static Library (.lib); that makes it possible to link all its content into the test program.
- Make CatHouseTests link to the .lib generated by CatHouse. In Project Properties of CatHouseTests, go to Common Properties > Add New Reference > Projects > CatHouse.
- Add #include "Cat.h" in CatTest.cpp and add the CatHouse source directory to the include path of CatHouseTests (C/C++ > General > Additional Include Directories) so the compiler can find the header.
- In the TestFeed() method, instantiate a Cat object on the native stack and call its Feed() method.
- Build the solution in Release mode then in ReleaseUnitTests mode.
- In the NUnit GUI, press Ctrl+O and select CatHouseTests.dll. The TestFeed test should appear in the tree view.
Friday, 14 November 2008
- now I know there is a decent framework for unit-testing native C++ code in VSTS2008 Dev. OK, you have to write C++/CLI, but the next best thing is cppunit, so VSTS2008 Dev is apparently the best tool around for the moment.
- database unit-testing in VC9 DBPro, which creates a C# class for you that automatically calls a T-SQL sproc in which you write your test.
- other tools: I had an overview of the tools available for managing unit-tests and IoC containers: Pex, TypeMock, TestDriven. They're on my list of things to try out next.
- It's worth digging into SpecExplorer 2007 to see what can be done with it. The idea of creating a simplified state model of your system and having the tool generate all possible paths for you is very interesting.
- trusting your tests
- designing the app (because it forces you to take the point of view of the client)
- implementing stuff that works partially as opposed to not at all
- code quality
- documenting the code
- reducing integration testing time, reducing number of bugs in production
- add attribute [Test] to methods, [TestFixture] to classes
- naming: tests must have a good name to document their intention: method + scenario + expected behaviour.
- arrange (instantiate objects)
- act (call the method under test)
- assert (check the expected result)
- Make it Fail: write a test that fails because the method is not doing what's expected. Purpose: test your test
- Make it work: with minimum code, make the test pass by writing anything even if it's stupid as long as it doesn't break any other test.
- Make it better: refactor but do not add functionality and make sure the test still passes.
- Want to add more functionality? Ok but write another test first.
- use TestDriven to run a test by right-clicking inside the method body. Apparently TestDriven allows you to run a test even if you don't have a fully working executable.
- If you use VSTS, use the built-in MSTest because it brings a lot of integration benefits. It runs slower though. Usually unit-tests are kept in a separate VS project.
- Then you import the schema from an existing database. VSTS automatically creates a folder structure for schema objects in the solution folder. Each db object has got its own script, therefore you can commit them all into CVS if you need to (you don't have to use Foundation Server).
- To add an object, right-click Add Table in the Schema View. This generates a script skeleton that you edit as you see fit (add columns, add options, etc.). Save the file and the Schema View is updated automatically; change an object in the Schema View and the script is updated automatically.
- Choose a target database to compare the offline schema to the target schema.
- A view appears that shows all changed, new or missing objects.
- Select one of the changed objects in the list and view the difference (new column, changed line in a sproc, etc.)
- You can also preview the DDL script that VSTS is about to generate
- You can enable/disable individual differences from the list if you don't want them to be part of the script or force some to be in the scripts if you really want them to be there.
- Export the DDL script to a .sql file, then edit it if you need to tweak it before testing it against a pre-prod database.
- You can create a unit-test for a sproc.
- VSTS automatically creates a test class that calls the sproc containing the unit test; all you have to do is write the T-SQL for that sproc. DB sproc tests are therefore perfectly integrated with all other tests.
- You can attach a Pre-Test and Post-Test to a sproc test in order to put the db into a known state.
Thursday, 13 November 2008
- coded UI tests (where you write UI tests using automation as opposed to record/replay)
- Database unit-tests to test T-SQL stored procedures.
- Generic tests. This is actually an external program that will look like an ordinary test from within visual studio. You specify the command line arguments to call the external program. You have the option to redirect the standard output / error to the test results or not.
- Load tests. That's quite fun: you can simulate heavy load conditions with many users. Among other things you can define a load pattern (constant or increasing load) and the distribution (percentage of appearance for each test).
- Manual tests: simple text file describing a manual test procedure. This is there for auditing purposes only obviously.
- Ordered test: specify a list of tests to be run in a specific sequence in the situation where order counts (that's for integration tests only, unit-tests should not be order dependent)
- To write a fancy UPDATE statement that needs a complex subquery, use WITH (common table expression)
- To keep 2 tables in sync with one single statement, use the new MERGE syntax (new in 2008).
- To filter out the results of a ranking query without affecting the rank values, use WITH again.
- Lots of little tests is better than a single big one
- There are no perfect models, only useful ones
- Start small: even very small models can find bugs
DVP01-IS Model Based Testing with Spec Explorer
In the afternoon Keith did another demo of SpecExplorer 2007, followed by a Q&A session.
- How does SpecExplorer interact with the System Under Test?
- SpecExplorer is a free add-in to Visual Studio that will be available as a powertoy in the beta of VS2010.
Wednesday, 12 November 2008
If you're new to unit-testing, don't get into TDD immediately; start with writing a few unit-tests incrementally to get the hang of it. His powerpoint is here.
TLA401 - Microsoft Visual C++ 2008 for Unrepentant C++ Developers
The last time I saw Kate Gregory was at TechEd 2007 in Orlando. She hasn't changed: her demos go really fast :-). VS2008 SP1 comes with TR1, a set of proposed additions to the next C++ standard, C++0x. It includes stuff that is currently in the boost library. Kate explained and demoed each of the following language additions:
- shared_ptr: safer, more intuitive, more powerful than auto_ptr.
- lambda expressions: written with a capture clause such as [], [&] or [=]; the equivalent of => in C#. They allow you to define short functions inline. She showed how elegantly they integrate with the STL.
- auto: equivalent of C# var so you don't have to spell out a type when the compiler can work it out by itself.
- TR1 additions
- New MFC classes, ribbon toolbar
- New libraries for parallel development
- Some Vista features are only available from C++
- C++ is the most practical language to do interop
- choose between sampling (takes snapshots at intervals) and instrumentation (records all function calls).
- instrumentation generates a call tree view
- performance reports show how much time was spent in each function.
- compile with the /analyze option to raise warnings highlighting potential coding mistakes. This tool is based on another one called PREfast.
- display the portions of code that were executed by the unit-tests.
- Partial runs possible
- No need for Configuration
- Consistent pass/fail
- Order does not matter
- Tests must be Fast
- Constructor (forces you to pass a concrete object)
- Property (passing the concrete object is optional)
- Factory (passing the concrete object is done as part of a factory method)
- Spring.net (I met one of the authors of Spring.net at TechEd Orlando)
- MSTest (the one integrated in Visual Studio Team System)
- Typemock Isolator
Tuesday, 11 November 2008
- DB refactoring: changing a table name automatically updates all dependencies
- DB deployment: you can generate a script that contains all differences between the current schema and the production schema.
- DB testing: C# unit-tests for T-SQL stored procedures where you define setup, test and teardown
- Test data generation (that's really handy for the setup part of the unit tests)
- see author/changes directly in code
- define check-in policies (to force you to pass unit-tests for instance)
- define triggers such as on check-in build
- set up build notification popups (through powertools)
- code metrics: show the number of lines, cyclomatic complexity
- all of the above present in VSTS for DB, Team System Server is not required
- Planning, task management: creation of user stories / tasks hierarchy, capacity planning, reports to show test results per user story / task.
- Gated check-in: define check-in conditions, for instance you can't check-in if it doesn't build or if the unit-tests don't pass
- sexed it up with the MFC C++ ribbon classes,
- parallelised a lengthy for loop across 8 cores with parallel_for
- and eventually showed off the multi-touch capabilities of Windows 7 by moving both pong racquets with his fingers simultaneously, under a round of applause.
Sunday, 2 November 2008
- your data is encrypted with AES, which is the least you should ask for, really.
- the really nice thing: it comes with a client that automatically syncs your PC folders with the remote storage.
- the sync works across many PCs, which is cool as well.
- you can access your files online over the web.
- you can, and that's really convenient, send links to large files by email to other people. The recipient doesn't need to log in to retrieve the file; he just clicks it. You get notified when he receives it.