Brian Harry has recently released a new version of Database Professional Power Tools! You can read up on it here.

Two things really interest me:

  1. The ability to run static code analysis on SQL code.
  2. The Data Generation Wizard for creating data generation plans.

Static Code Analysis

You can use the SqlAnalysis build task to perform static code analysis on your SQL code.

Data Generation Wizard

“You can use the Data Generation Wizard to create a data generation plan that copies data from one database and inserts it into another database. This data generation plan is useful if you want to use live data for most of your testing needs but make some minor changes to ensure privacy.” – Taken from documentation


I’ve been working on a few personal projects. One of them involved building an ASP.NET website. Being the good geek that I am, I thought I’d try out ASP.NET 3.5 and use features I’d never used before. That, unfortunately, proved to be my downfall. One of the new features I decided to implement was the ASP.NET membership provider.

From the articles I read, the membership provider seemed to take a lot of the grunt work out of mundane tasks such as creating users, handling roles, etc. No one ever told me what a chore it would be to extend, though. One glaring oversight in the default membership provider is that there is no way to do a logical delete on a user. Sure, you can use fields such as IsApproved or IsLockedOut, but that’s really a workaround and not a true solution. There are so many instances where you want to “delete” a user while maintaining all the related records, and that’s simply not possible without a logical delete. This is one big failure in the Membership Provider.
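
For the sake of illustration, here is a minimal sketch of how a logical delete could be bolted on by subclassing SqlMembershipProvider. The MyUsers table, its IsDeleted column, and the “MembershipDB” connection string name are all hypothetical; the default schema has no such flag, which is exactly the problem.

using System.Configuration;
using System.Data.SqlClient;
using System.Web.Security;

// Sketch only: assumes a hypothetical MyUsers table with an IsDeleted bit
// column sitting alongside the provider's own user table.
public class LogicalDeleteMembershipProvider : SqlMembershipProvider
{
    public override bool DeleteUser(string username, bool deleteAllRelatedData)
    {
        string connectionString = ConfigurationManager
            .ConnectionStrings["MembershipDB"].ConnectionString; // hypothetical name

        using (SqlConnection connection = new SqlConnection(connectionString))
        using (SqlCommand command = new SqlCommand(
            "UPDATE MyUsers SET IsDeleted = 1 WHERE UserName = @UserName", connection))
        {
            command.Parameters.AddWithValue("@UserName", username);
            connection.Open();

            // Flag the row instead of deleting it, so related records survive.
            return command.ExecuteNonQuery() == 1;
        }
    }
}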

Other gripes I’ve had with the technology: extending the provider is an extremely painful process and requires an almost complete rewrite of the backend functions. In the end, I’m not saving any time by using this at all. In fact, it’s causing me more grief! I’m also tied to the default database layout for the membership provider. And I do not want to use Profiles for storing user information; they’re of little use to me as I lose the strong typing.
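
To show what I mean about losing the strong typing, compare reading a value out of Profile with reading a property off a plain class. The UserInfo class and its Age property here are purely illustrative:

using System.Web;

class UserInfo { public int Age { get; set; } }  // hypothetical strongly-typed class

class ProfileTypingDemo
{
    void Demo(UserInfo userInfo)
    {
        // Profile values come back as object, so every read needs a cast,
        // and a typo in the property name only fails at runtime:
        int profileAge = (int)HttpContext.Current.Profile.GetPropertyValue("Age");

        // A plain class property is checked by the compiler:
        int classAge = userInfo.Age;
    }
}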

My conclusion… ASP.NET Membership Provider Fails!!! I am seldom so critical of any technology but the ASP.NET team who came up with this really should have a good hard look at what they’ve done, or rather what they failed to do. After spending many hours trying to shoe-horn the membership provider into my solution, I’ve decided to bite the bullet and just implement my own data structures and business logic.


I’m currently building an ASP.NET 3.5 website using LINQ as my OR mapping framework of choice. There are a few things I’d like to share. One observation is that the LINQ designer, in all its beautiful visual design, does not refresh its schema after I make changes to the database schema. I’ve tried using SQLMetal, but in the end it seems easier to just remove the table and add it again through the designer.

Second, it took me a while to work out just where LINQ fits in a multi-tiered application. I know the traditional way is to go UI – business layer – data access layer. That’s all well and good, but LINQ seems to span both the business and data access layers. I found it tricky to separate them out, and in the end I decided to leave them in one layer.

While LINQ generates the strongly-typed classes for you to access tables, very often the developer will have to create their own classes representing those tables and add custom logic and additional properties. Here’s an example:

using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Text;

[Table(Name = "UserInfo")]
public class UserDetails
{
    private EntityRef<User> _User;
    private EntityRef<UserMembership> _UserMembership;

    [Column(IsPrimaryKey = true)]
    public int UserInfoID { get; set; }

    [Column]
    public Guid UserID { get; set; }

    [Association(Storage = "_User", ThisKey = "UserID", OtherKey = "UserId")]
    public User User
    {
        get { return this._User.Entity; }
        set { this._User.Entity = value; }
    }

    [Association(Storage = "_UserMembership", ThisKey = "UserID", OtherKey = "UserId")]
    public UserMembership UserMembership
    {
        get { return this._UserMembership.Entity; }
        set { this._UserMembership.Entity = value; }
    }

    [Column(Name = "Work")]
    public string WorkPhone { get; set; }

    [Column(Name = "Home")]
    public string HomePhone { get; set; }

    // Address columns referenced by the Address property below
    // (column names assumed).
    [Column]
    public string AddressLine1 { get; set; }

    [Column]
    public string AddressLine2 { get; set; }

    [Column]
    public string PostCode { get; set; }

    // Custom, non-mapped property: composes the address columns
    // into a single display string.
    public string Address
    {
        get
        {
            StringBuilder addressBuilder = new StringBuilder();
            addressBuilder.AppendLine(AddressLine1);

            if (!string.IsNullOrEmpty(AddressLine2))
                addressBuilder.AppendLine(AddressLine2);

            addressBuilder.AppendLine(PostCode);

            return addressBuilder.ToString();
        }
    }
}
In this example, you can see that there is custom logic in the Address property. As such, in some ways this object contains business logic and yet is tied to the database. I've actually gone on to create proper business objects, which I might share if there's enough interest, containing functions to return all the users, get users by username, and so on. But as you can see, the delineation is not as distinct as it would be with a traditional 3-tier approach.
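
In the meantime, here's a rough sketch of the shape those business objects take. This is an outline under my own assumptions: the "SiteDB" connection string name is hypothetical, and the User entity is assumed to carry a UserName column.

using System.Collections.Generic;
using System.Configuration;
using System.Data.Linq;
using System.Linq;

// Sketch only: a simple repository-style business object over the
// LINQ to SQL entities above.
public static class UserRepository
{
    private static DataContext CreateContext()
    {
        // "SiteDB" is a hypothetical connection string name.
        return new DataContext(
            ConfigurationManager.ConnectionStrings["SiteDB"].ConnectionString);
    }

    public static List<UserDetails> GetAllUsers()
    {
        using (DataContext db = CreateContext())
        {
            return db.GetTable<UserDetails>().ToList();
        }
    }

    public static UserDetails GetUserByUserName(string userName)
    {
        using (DataContext db = CreateContext())
        {
            // Walks the association to the User entity; assumes it
            // exposes a UserName column.
            return db.GetTable<UserDetails>()
                     .FirstOrDefault(u => u.User.UserName == userName);
        }
    }
}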

Overall, I found LINQ to be new and interesting. In terms of how it stacks up against other OR mapping frameworks, I’d have to admit that while it might not be as good as some of the commercial ones, it definitely is a bold step by Microsoft into that arena. Personally, I quite like LINQ and I’m going to complete this website with it.


Just today, a former colleague and friend emailed me to inform me of a blog he was writing. I had a quick look and was quite amazed at some of his insights and how he has grown. One thing I noticed was that he has broken out of the Microsoft mould and grown as a developer. The Paul I knew was almost exclusively a .NET developer and he was good at it.

The Paul of today, though, seems to have adopted different technologies, really accepted his shortcomings, and worked to overcome them. One of the things he’s delved into is language syntax and parsing. While I cannot claim to understand everything he’s doing, I found his project interesting and something I’d like to know more about.

Keep it up, Paul. I’d like to see how things evolve for you.


A Look at LINQ

12 Dec 2007

I’ve been interested in LINQ for some time, as I’m a big fan of object-relational mapping. I was initially introduced to the world of object-relational mapping by a brilliant man by the name of Malcolm Young. He showed me that there was more than just the proverbial “ds.” and stored procedures. By using such a framework, I noticed the ease with which applications were created. No longer are we hindered by the requirement of having to build a data tier. It’s also a lot nicer to write:

customer.FirstName

as opposed to:

dsCustomer.CustomerTable[....]

One of the big arguments against using LINQ is the security risk of having developers directly accessing the database. I cannot disagree with this argument. With enterprise-level projects, DBAs will most often have ownership over a database and tightly control the manner in which it is accessed; in some cases it is possible to restrict access to just stored procedures.

My argument against that is that business logic invariably finds its way into stored procedures, and maintaining stored procedures is harder than maintaining managed code. Furthermore, my original observation still stands: it is much easier and nicer to deal with objects than with datasets and datatables. You can always call a stored procedure from LINQ and get back an object to work with. And should you not lock down your database, there is still nothing stopping a developer from building dynamic SQL queries.

While it’s still early days, I’m very impressed with LINQ and I believe it has real possibilities. I cannot see Microsoft including LINQ in .NET 3.5 “just for the fun of it”. Someone must have had the vision to see beyond the limitation of always having to build a data tier and calling stored procedures.
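
To give a taste of that object-centric style, here's a small self-contained sketch; the Customers table and its columns are invented purely for illustration:

using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// Illustration only: a made-up Customers table.
[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)] public int CustomerID { get; set; }
    [Column] public string FirstName { get; set; }
    [Column] public string LastName { get; set; }
}

public class QueryDemo
{
    public static void Run(string connectionString)
    {
        using (DataContext db = new DataContext(connectionString))
        {
            // Strongly-typed query; the result is an anonymous type,
            // not a DataRow indexed by string.
            var names = from c in db.GetTable<Customer>()
                        where c.LastName.StartsWith("Y")
                        select new { c.FirstName, c.LastName };

            foreach (var n in names)
                Console.WriteLine("{0} {1}", n.FirstName, n.LastName);
        }
    }
}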



Today I was in a meeting with several TFS experts, including Grant Holliday and Joe Schwetz. Joe presented on how Microsoft uses TFS to handle its projects and the value it has added to the organisation. I found it quite interesting to hear how Microsoft structures its development with TFS. Their branching structure, in particular, was interesting to behold. It revolves around a Main branch containing high-quality code, with development branches created off the Main trunk to isolate features.

[Diagram: Microsoft's branching structure – feature branches coming off the Main branch]

Development branches are created for the purpose of isolating feature development. Joe revealed that Microsoft has as many as 630+ feature branches, and the feature branches coordinate their check-ins by scheduling dedicated check-in times for each branch. In addition, there is a process that Microsoft follows upon creation of a new feature branch.

The stages of the process are as follows:

  1. CP0 – The feature team writes a brief one-page document outlining what they are developing
  2. CP1 – Design phase
  3. CP2 – Coding phase
  4. CP3 – Quality gates

There are 20 quality gates to be satisfied before a merge back to the Main branch is allowed.

Personally, I find it interesting to see why Microsoft prefers this branching strategy. From what I heard from Joe, this model works very well for them, as it prevents incomplete features from being added to a solution. The quality gates ensure the code checked in to the Main branch is of an acceptable quality. Furthermore, according to Joe, there are reporting benefits to be drawn from this model. Unfortunately, my lack of knowledge of the reporting aspects of TFS leaves me somewhat in the dark there.

Overall, I was quite impressed with Joe’s presentation and I’m quite inspired to see whether I can introduce their philosophy into the projects I’m involved in. I can see how Microsoft uses TFS as a total package that is essential to the software development life cycle, rather than an optional element or mere source control as so many organisations treat it.



I have successfully integrated Siebel Tools 7.7 with TFS. It took a bit of fiddling but it works. The integration, however, is purely on a source control level. Currently I’m working on a way to associate work items through Siebel and it doesn’t look too difficult.

Siebel Tools has an option to enable source control (Visual SourceSafe), and it checks code in and out via a batch file. It did not take too much customisation to get TFS working with Siebel Tools via the batch script. The hardest part was creating the workspace for the user and pointing it to the proper project branch.

Here’s what the script looks like:

PATH=c:\Program Files\Microsoft Visual Studio 8\Common7\IDE;%PATH%
SET LOG=C:\logpath\log.txt
SET SOFTWARE=tf

SET CHECKIN=%SOFTWARE% checkin
SET CHECKOUT=%SOFTWARE% checkout
SET ADD=%SOFTWARE% add
SET OPTIONS=-i
SET PROJECT="$/MyProject/Development"
SET TFSSERVER=xxxxxxx

SET WORKSPACENAME=%COMPUTERNAME%_%USERNAME%

REM Siebel Tools passes: action, working directory, comment, file
SET Action=%1
SET DIR=%2
SET Comments=%3
SET File=%4

ECHO CHANGING DIRECTORY TO %DIR%
CHDIR %DIR%

REM Create the workspace and map the project branch to the local directory
ECHO Executing %SOFTWARE% workspace /server:%TFSSERVER% /new %WORKSPACENAME% /noprompt >> %LOG%
%SOFTWARE% workspace /server:%TFSSERVER% /new %WORKSPACENAME% /noprompt >> %LOG%

ECHO %SOFTWARE% workfold %PROJECT% %DIR% >> %LOG%
%SOFTWARE% workfold %PROJECT% %DIR% >> %LOG%

if %ACTION%==checkout goto CHECK_OUT
if %ACTION%==checkin goto CHECK_IN
goto END

:CHECK_OUT
echo ============Check out file %FILE% from Source Control System============ >> %LOG%
REM Create a placeholder if the file does not exist locally yet
if not exist %FILE% echo "New File" >> %FILE%
attrib -r %FILE%
echo Add %FILE% in case it doesn't exist in Source Control System
REM tf add takes no comment argument; %Comments% is ignored here
%ADD% %FILE% %OPTIONS% >> %LOG% 2>&1
echo Start checking out %FILE% from Source Control System >> %LOG%
%CHECKOUT% %FILE% %OPTIONS% >> %LOG% 2>&1
attrib -r %FILE%
goto END

:CHECK_IN
echo ============Check in file %FILE% into Source Control System============ >> %LOG%
echo Check in %FILE% into Source Control System
%CHECKIN% %FILE% %OPTIONS% >> %LOG% 2>&1
attrib -r %FILE%
goto END

:END
echo ===================End Of srcctrl.bat====================== >> %LOG%

Basically, the script creates a workspace and maps the specified project to it. The rest is fairly straightforward: it performs check-ins and check-outs à la VSS. One thing to note is that Siebel Tools executes this script for each file that is checked in or out, which means you will potentially end up with a lot of records in the TFS data warehouse.

As stated before, the lack of work item association is a severe limitation of this script, but it is easily resolved by writing some code to handle work item selection during check-ins and check-outs. The prospect of writing an MSSCCI provider for Siebel Tools is an interesting challenge which I will take up, time permitting.
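
For anyone wanting a head start on the work item side, here's a rough sketch, using the TFS object model, of what the association could look like. The server URL, workspace lookup, and work item ID are placeholders, and this is an outline of the approach rather than tested code:

using Microsoft.TeamFoundation.Client;
using Microsoft.TeamFoundation.VersionControl.Client;
using Microsoft.TeamFoundation.WorkItemTracking.Client;

public class SiebelCheckin
{
    // Sketch: check in pending changes and associate them with a work item.
    public static void CheckInWithWorkItem(
        string workspaceName, string owner, int workItemId)
    {
        // Placeholder server name
        TeamFoundationServer tfs =
            TeamFoundationServerFactory.GetServer("http://xxxxxxx:8080");

        VersionControlServer vcs =
            (VersionControlServer)tfs.GetService(typeof(VersionControlServer));
        Workspace workspace = vcs.GetWorkspace(workspaceName, owner);

        WorkItemStore store =
            (WorkItemStore)tfs.GetService(typeof(WorkItemStore));
        WorkItem workItem = store.GetWorkItem(workItemId);

        // Associate (rather than resolve) the work item with the changeset.
        WorkItemCheckinInfo[] workItems =
        {
            new WorkItemCheckinInfo(workItem, WorkItemCheckinAction.Associate)
        };

        workspace.CheckIn(
            workspace.GetPendingChanges(),
            "Checked in from Siebel Tools",
            null,          // no check-in notes
            workItems,
            null);         // no policy override
    }
}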
