.NET Framework 3.5 is released! Among the new enhancements? System.Collections.Generic.HashSet&lt;T&gt;.
It turns out no! Microsoft has chosen not to add an interface which matches this collection's set-based operations. There is no matching ISet interface which the new HashSet&lt;T&gt; implements. While some people will say "yeah, but it still implements IEnumerable&lt;T&gt;", that interface says nothing about set semantics.
The only explanation I could come up with for why this obvious interface was left out is that an interface alone cannot guarantee that a collection behaves like a set. For example, a set cannot contain the same item twice. If a custom collection implemented said ISet and let its Add method add the same item multiple times, what would stop it?
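To be fair, the duplicate rule is at least observable on the new class itself: HashSet&lt;T&gt;.Add returns a bool indicating whether the item was actually added. A quick sketch:

```csharp
using System;
using System.Collections.Generic;

class Program
{
    static void Main()
    {
        HashSet<int> set = new HashSet<int>();

        // The first Add succeeds; the duplicate is silently rejected.
        Console.WriteLine(set.Add(42));  // True
        Console.WriteLine(set.Add(42));  // False
        Console.WriteLine(set.Count);    // 1
    }
}
```

An ISet interface could have exposed this bool-returning Add, but as noted above, nothing would force an implementer to honor the semantics.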
I suppose a set isn't something you can define purely by an interface. Maybe I should let my emotions cool down before jumping all over Microsoft for leaving the interface out. I guess I was just imagining the possibility of using NHibernate without having to add Iesi.Collections to my project. Then again, what would make me think that would change, or change in a relevant time frame?
All that being said, I still wish they had included the interface! After all, there is an IDictionary interface, and a custom implementation could theoretically let the same key be added multiple times. If the dictionary semantics can't be guaranteed by that already existing interface, then I say it would have been OK for a set interface to exist too.
--John Chapman
Thursday, November 29, 2007
Where Is System.Collections.Generic.ISet?
Posted by John Chapman at 5:23 PM 2 comments
Labels: Visual Studio 2008
Tuesday, November 27, 2007
VSTS 2008 Web Test Forms Authentication
With the release of Visual Studio 2008 I was eager to test the Web Testing functionality to see if my issues from 2005 had been resolved. If you remember, Visual Studio 2005 did not support forms authentication with its web test functionality (see original post: Visual Studio 2005 Team Edition Web Test and the follow up VSTS Web Test Error Found).
Seeing as there is a hotfix for Visual Studio 2005 to resolve this issue, I figured it would be fixed in 2008 and that this was finally my chance to make heavy use of the Web Test functionality. I opened up Visual Studio 2008 Team Suite and gave a simple web test a shot. No go! I received the same exact errors I got with 2005.
How is this possible? How has such a blatant issue existed for over 2 years now? I know Microsoft is aware of it (based on the existence of the Knowledge Base article) so why hasn't someone resolved it for the release of 2008?
I'm very disappointed by this. So at this point I'm stuck looking for a workaround. I'm still looking for suggestions if anyone has them.
Note that this is really a cookie problem and not a forms authentication issue. The cookie is sent to the client but never returned to the server. When I find a suitable workaround I'll post it here.
--John Chapman
Posted by John Chapman at 7:37 PM 4 comments
Labels: Testing, Visual Studio 2008
Monday, November 19, 2007
VS 2008 Professional Download Horrors
This one is really strange to me. Today was the big release to manufacturing of Visual Studio 2008. I decided to try to download the VS 2008 Professional edition (I actually downloaded the Team Suite trial first) only to run into the fun of the Akamai Download Manager.
This thing caused me so many issues! I wonder if this is the first time Microsoft has used this product. I was trying to download it while using IE 7 on Windows Vista, only to notice that the page just kept refreshing. OK, so eventually I realize the pop-up blocker must be blocking the download without telling me. I disable the pop-up blocker, see the Akamai pop-up, and then it requests that I install an ActiveX component. OK, not my favorite thing in the world, but for VS 2008, you bet!
The download manager launches and asks me where I would like to place the file. I say "Download" please; it says "Sorry, you don't have access to that folder." I don't? That's strange. I try my E: drive instead; same story. I then try my Documents folder, and it looks like it works, except I get an error message stating that it cannot access my Documents folder and will place the file in Temporary Internet Files instead.
3.3 gigabytes later the download completes, only for me to discover that the file is invalid. I hunt it down in Temporary Internet Files and it can't be copied or opened. The OS says the file doesn't exist!
OK, this is frustrating. After a few more minutes I took the action I should have taken earlier: I opened up Firefox. Now I go to the same pages I went to before, and it just works. The download manager opens (as a Java applet this time), I select my download location, and it just works. Why was IE such a fiasco?
You would have thought Microsoft would have tested this on Vista with IE, wouldn't you? Thank you Firefox, I couldn't have downloaded Microsoft's product without you! Who would have thought?
--John Chapman
Posted by John Chapman at 8:09 PM 1 comments
Labels: Visual Studio 2008
NHibernate Access Performance Revisited
Two days ago I published a post titled NHibernate Access Performance. After making that post the results kept bothering me. First, they did not seem accurate: I could not for the life of me figure out how the CodeDom optimizer performed so well when accessing private fields. I reviewed the NHibernate code over and over and could not determine how it could possibly beat the basic getter/setter. Add to that yesterday's discovery regarding DateTime.Now precision, and I had to re-run these tests.
I learned that my instincts were dead on. The CodeDom optimizer for private field access isn't any faster than the basic field level access. It turns out I had a bug in my code where it was actually using the CodeDom property level access instead. I know, I know, I should be ashamed, I shouldn't write bugs! Truth be told, this was way too simple, I should have caught this earlier.
Now, due to the issue with the DateTime.Now precision issue, I decided to run my tests for 10,000,000 accesses. I figured at this point any precision issues should be insignificant. See the updated chart:
What's really interesting about these results is that the private field access now takes twice as long as the public property access when using basic reflection. This is more along the lines of what I would have expected. I do not know why 10,000,000 loops was enough to reveal this difference when 100,000 loops wasn't. I wonder if there is some cost incurred on the first access of a property which is not present for a field. If I find out more about this I will write a new post.
--John Chapman
Posted by John Chapman at 7:13 PM 2 comments
Labels: C#, NHibernate
Sunday, November 18, 2007
DateTime.Now Precision
While re-evaluating the numbers from yesterday's post NHibernate Access Performance, I thought the returned times in milliseconds seemed a bit strange. Specifically, the repeated values of 13.67 and 15.62 milliseconds. What are the odds of seeing the exact same values twice while running the tests I was running? I started to wonder how precise the DateTime.Now (or DateTime.UtcNow) values really are. I always assumed the value was updated every time the tick count incremented. It doesn't look like that is the case.
For fun, try running a console application where you write the current time in ticks to the screen with two statements in a row. The values are exactly the same. At least they were for me.
Now for more fun try the following code:
DateTime now = DateTime.UtcNow;
while (now == DateTime.UtcNow) { }
Console.WriteLine(((TimeSpan)(DateTime.UtcNow - now)).TotalMilliseconds);
When I first wrote this code and tried it, it returned 15.624 milliseconds every single time (a common value I saw while running yesterday's performance tests). However, now when I run it I see 0.9765 milliseconds every time. Something is controlling the precision of DateTime.Now, and I have no idea what.
I would have expected something like this to have been publicized more. I've never seen an article explaining the precision of DateTime.Now. The lack of precision seems like it could be an issue for systems which perform many transactions per second, where it would be helpful to be able to guarantee ordering based on time. That unfortunately doesn't seem possible at this resolution.
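For measuring elapsed time specifically, System.Diagnostics.Stopwatch (available since .NET 2.0) sidesteps the clock-tick issue by using the system's high-resolution performance counter when one is available. A minimal sketch:

```csharp
using System;
using System.Diagnostics;

class Program
{
    static void Main()
    {
        // Stopwatch reports whether a high-resolution counter backs it,
        // and its frequency, independent of DateTime.Now's update rate.
        Console.WriteLine("High resolution: " + Stopwatch.IsHighResolution);
        Console.WriteLine("Ticks per second: " + Stopwatch.Frequency);

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < 1000000; i++) { }  // the work being timed
        sw.Stop();

        Console.WriteLine("Elapsed ms: " + sw.Elapsed.TotalMilliseconds);
    }
}
```

Note that Stopwatch measures durations; it does not help with ordering wall-clock timestamps across transactions.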
As a result of these findings I think I'm going to re-run yesterday's tests comparing private field access with public property access in NHibernate. When run for longer periods of time the results are actually a little different from what we saw before. Not different enough to change my conclusions, but different enough to be interesting nonetheless.
I've also been trying to hunt down how the reflection optimizer is actually helping with the private field access. From the code I see in NHibernate it looks like the performance should be identical to my non optimized getter/setter for the fields. Look for more information to come on that topic if I find it.
Posted by John Chapman at 1:45 PM 2 comments
Saturday, November 17, 2007
NHibernate Access Performance
*1/5/2008 Update - Source code is now available for download if you would like to test these findings yourself. Download Here*
*11/19/2007 Update - It turns out my instincts were correct. Upon further review of the code the CodeDom field getter/setter was actually using the CodeDom property getter/setter. I had a very hard time understanding how the CodeDom reflection optimizer improved private field level access, now it turns out that it did not. An updated chart has been posted at the bottom of this article *
Recently I was involved in a discussion on the NHibernate forums regarding how to implement the null object pattern which later moved to a discussion regarding the performance impact of such a pattern. I have been told many things regarding the performance impact of reflection and more specifically the performance impact of reflection access of a property versus a field, but I have never actually researched these items myself. I finally took the time to closer examine the impact of NHibernate access mechanisms, and the results really surprised me!
I've always been told that accessing private members is significantly slower than accessing public members (due to Code Access Security checks). Because of this I used to prefer to access properties instead of private fields. However, after noticing that even with a very large application the reflection impact of field access versus property access wasn't noticeable I shifted gears to believe that all items which should not be settable from code should rely on NHibernate's nosetter access mechanism. Basically why expose a setter for your object's Id property when it is an identity column that only NHibernate should ever populate? This makes our code safer and helps ensure that someone who is new to NHibernate does not try to set that id value due to a misunderstanding of how this new O/RM paradigm works.
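For reference, this is roughly what such a mapping looks like (the class and column names here are made up for illustration); with access="nosetter.camelcase" NHibernate writes directly to the camelCase backing field, so the public Id property needs no setter:

```xml
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2">
  <class name="Customer" table="Customers">
    <!-- NHibernate sets the private field 'id' directly;
         the public Id property exposes no setter to application code. -->
    <id name="Id" column="CustomerId" access="nosetter.camelcase">
      <generator class="identity" />
    </id>
    <property name="Name" column="Name" />
  </class>
</hibernate-mapping>
```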
Test Overview
I figured it was about time to do some actual tests which showed the impact of using public property reflection versus private field reflection. I decided to write a small .NET console application which would include a simple type which contained a single private field and a single public property which wrapped that field. See the below class:
public class SampleObject
{
    private int val;

    public int Val
    {
        get { return val; }
        set { val = value; }
    }
}
I then chose the mechanism I would use to measure relative performance. I decided to use threading so that property access and field access would run at exactly the same time; I did not want the results to be skewed by changes in the system environment between runs, and with both tests running simultaneously, both accessors are subject to exactly the same constraints. I would start two threads and then immediately call Join on both from the main application thread, which would let me evaluate the differences. Note that each thread calculates its own processing time to keep the measurement as accurate as possible.
Now that I knew how I would compare field access with property access, I had to determine the metrics I wanted to measure. NHibernate 1.2 provides a few options for how it accesses or sets the values of fields and properties. The tool always uses reflection, but there are actually multiple ways NHibernate can use reflection.
1) Basic Getter/Setter
NHibernate's first and most basic mechanism for accessing/setting property values is via the NHibernate.Property.BasicGetter and NHibernate.Property.BasicSetter classes. Basically these classes wrap up a System.Reflection.PropertyInfo instance and then use that PropertyInfo's GetValue and SetValue methods to get or set the value via reflection. This is probably the mechanism most developers have used to get/set values via reflection (if they have used reflection).
2) Field Getter/Setter
NHibernate offers NHibernate.Property.FieldGetter and NHibernate.Property.FieldSetter to provide the basic getter/setter functionality to fields as well as properties. This works the same as above but uses the System.Reflection.FieldInfo class instead of the PropertyInfo class.
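Stripped of the NHibernate wrappers, these first two mechanisms boil down to PropertyInfo and FieldInfo. A self-contained sketch of what BasicGetter/BasicSetter and FieldGetter/FieldSetter do on each access:

```csharp
using System;
using System.Reflection;

public class SampleObject
{
    private int val;

    public int Val
    {
        get { return val; }
        set { val = value; }
    }
}

class Program
{
    static void Main()
    {
        SampleObject obj = new SampleObject();

        // Property access via PropertyInfo (what BasicGetter/BasicSetter wrap).
        PropertyInfo prop = typeof(SampleObject).GetProperty("Val");
        prop.SetValue(obj, 7, null);
        Console.WriteLine(prop.GetValue(obj, null));  // 7

        // Field access via FieldInfo (what FieldGetter/FieldSetter wrap);
        // BindingFlags are required because the field is private.
        FieldInfo field = typeof(SampleObject).GetField(
            "val", BindingFlags.Instance | BindingFlags.NonPublic);
        field.SetValue(obj, 9);
        Console.WriteLine(field.GetValue(obj));  // 9
    }
}
```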
3) CodeDom Reflection Optimizer
Due to the performance impact of using reflection to get/set values NHibernate introduced the concept of "reflection optimizers". Basically NHibernate will build a custom class which it will use to access the field/property value of your object and then access the value via a delegate to this newly created accessor method to allow NHibernate to bypass reflection for each access.
The CodeDom mechanism of reflection optimization exists to support .NET 1.1. NHibernate generates its own C# source code at runtime which wraps your property/field access, then runs this dynamically generated code through the built-in System.CodeDom.Compiler.CodeDomProvider class (more specifically the Microsoft.CSharp.CSharpCodeProvider class) to compile it. After compiling, NHibernate uses reflection to create an instance of the new class and then uses that class for its access.
4) Lightweight Reflection Optimizer
This technique of reflection optimization is very similar to the CodeDom optimization mentioned above, except that it does not require a C# compiler. I believe it is called lightweight because it does not incur the overhead of the compiler. Instead it relies on the System.Reflection.Emit.ILGenerator class to build the accessor class dynamically, skipping the compiler while producing the same kind of output the compiler would.
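This is not NHibernate's actual code, but the idea can be sketched with a DynamicMethod: emit a tiny accessor once with ILGenerator, then call it through a delegate, so the reflection cost is paid at build time instead of on every access:

```csharp
using System;
using System.Reflection;
using System.Reflection.Emit;

public class SampleObject
{
    private int val;

    public int Val
    {
        get { return val; }
        set { val = value; }
    }
}

class Program
{
    delegate object GetterDelegate(SampleObject target);

    static void Main()
    {
        FieldInfo field = typeof(SampleObject).GetField(
            "val", BindingFlags.Instance | BindingFlags.NonPublic);

        // Emit: load the target, read the field, box the int, return.
        // skipVisibility (the final 'true') lets the method read a private field.
        DynamicMethod getter = new DynamicMethod(
            "GetVal", typeof(object), new Type[] { typeof(SampleObject) },
            typeof(SampleObject), true);
        ILGenerator il = getter.GetILGenerator();
        il.Emit(OpCodes.Ldarg_0);
        il.Emit(OpCodes.Ldfld, field);
        il.Emit(OpCodes.Box, typeof(int));
        il.Emit(OpCodes.Ret);

        GetterDelegate get =
            (GetterDelegate)getter.CreateDelegate(typeof(GetterDelegate));

        SampleObject obj = new SampleObject();
        obj.Val = 5;
        Console.WriteLine(get(obj));  // 5
    }
}
```

Each subsequent call to the delegate is a plain method invocation, which is why the optimized numbers below are so much lower than the basic reflection numbers.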
Testing Code
For all tests I used the NHibernate types IGetter, ISetter and IReflectionOptimizer to ensure that my code followed the exact same code path that users of NHibernate can expect.
Now that I had my testing technique in place, I used the following code for the testing loops:
public void TestGet()
{
    object value;
    DateTime begin = DateTime.Now;
    for (int i = 0; i < NUM_LOOPS; i++)
    {
        value = getter.Get(obj);
    }
    Time = DateTime.Now - begin;
}

public void TestSet()
{
    DateTime begin = DateTime.Now;
    for (int i = 0; i < NUM_LOOPS; i++)
    {
        // ISetter.Set takes the target object as well as the value.
        setter.Set(obj, i);
    }
    Time = DateTime.Now - begin;
}
And then to run the actual tests I needed the following code:
Thread propertyThread = new Thread(propertyContainer.TestGet);
Thread fieldThread = new Thread(fieldContainer.TestGet);
propertyThread.Start();
fieldThread.Start();
propertyThread.Join();
fieldThread.Join();
The prior loop is written for each case passing in the appropriate Getter/Setter.
After each run the results are output to the console window for me to analyze.
Test Results
Now for the good stuff! How did the results turn out? First for reference I ran this test on a slightly outdated computer (Athlon XP 3700+ 2GB Ram Vista Ultimate) with release code. The results I found were not at all what I expected. Note that for each scenario I ran the methods through 100,000 loops so the times shown are the time it takes (in milliseconds) to get/set a field/property value 100,000 times and with two simultaneous threads. Each bar graph pair was run simultaneously. See the graph:
| | Property (msecs) | Field (msecs) |
|---|---|---|
| Basic Getter | 524.38 | 391.58 |
| Basic Setter | 671.83 | 505.83 |
| Lightweight Getter | 13.67 | 27.34 |
| Lightweight Setter | 29.30 | 15.62 |
| CodeDom Getter | 15.62 | 25.39 |
| CodeDom Setter | 35.15 | 13.67 |
Now, the first thing that jumps out at you is that the reflection optimizer strategies are significantly faster than the non-optimized techniques. Well duh, it has "optimize" in the name, right? That was to be expected. I just didn't think it would make this much of an impact; I expected something more along the lines of 3 times faster, not 38 times faster! This part was really encouraging.
The part that threw me for a loop was that with basic reflection the private field access was actually faster than the public property access. I never would have guessed that. Haven't we all been hearing for years about how slow private field access is?
Given this I can't possibly see why someone would avoid using field-level access. With this sort of performance it seems to provide the cleanest mechanism for hydrating your objects and then analyzing and saving the changes. From now on there is no way I'll put a setter on a property where it does not make sense from a public API perspective. I feel like the results of these tests lifted some chains off my shoulders. (OK, very light chains made of plastic, but chains nonetheless!)
Is anyone else surprised by these results? I find them encouraging, but if I was making predictions before running the tests, this is not what I would have expected.
If anyone would like the full source code (it's about 210 lines of code total) leave me a comment and I will e-mail it to you.
--John Chapman
*11/19/2007 Update - Below is an updated chart that includes the correct data. It turns out that a bug in the code caused the CodeDom field getter/setter to use the property getter/setter instead. The results now make a lot more sense. I'm sorry for any confusion.
Note that the tests had to be re-run, so the numbers came out slightly different this time. Also note that the items had to be abbreviated (LW = Lightweight and CD = CodeDom). Take note that the CodeDom optimizer does nothing for field-level access, which is what I originally expected (since after reviewing the code I saw that for non-basic getters/setters it just calls the provided IGetter/ISetter).
With this run I also added direct property access as a baseline getter and setter. This helps show the true cost of using reflection (or NHibernate's reflection optimizers). Note that no direct value is provided for the field, since direct access to a private field is not possible from outside the class.
*For Updated Results with 10,000,000 Loops see follow up post: NHibernate Access Performance Revisited.
--John Chapman
Posted by John Chapman at 11:55 AM 7 comments
Labels: C#, NHibernate
Saturday, November 10, 2007
ASP.NET Page.IsPostBack: Who Started It?
I've seen it in many applications. I've even been guilty of it in the past. But how did it get started? I'm talking about the applications that look like the following:
protected void Page_Load(object sender, EventArgs e)
{
    if (!Page.IsPostBack)
    {
    }
}
Why in the world do we do this? Developers that check if (Page.IsValid) are just as guilty. If you haven't figured it out I'm talking about using the Page property of the ASP.NET Page class.
If you're not familiar with the ASP.NET framework, you have System.Web.UI.Control which is the cornerstone of all visual aspects in ASP.NET. All visual elements inherit from that base class, including the ASP.NET page itself. The control class has a Page property which references the System.Web.UI.Page class which contains the control. In the case of the Page object, the Page property just returns itself.
Whenever we use Page.IsValid or Page.IsPostBack on our page we are saying "give me the page of this page and then check if it is valid," when we could just write IsValid or IsPostBack and ask the question directly.
This is not something I've only seen once or twice, I've seen it on almost every ASP.NET project I've worked on. I have seen every developer I have ever worked with do this. But why would so many people choose to use Page.IsPostBack instead of just IsPostBack? I could understand one subset of developers doing this but it almost seems universal.
Out of curiosity I broke out Professional ASP.NET, which is the oldest ASP.NET reference book I had available to me (published June 2001), and I took a look at the IsPostBack section. Sure enough, everywhere they gave sample code, it was Page.IsPostBack! Are these the guys to blame for this? Does anyone have a better reference that shows this behavior? I don't have access to the old MSDN documentation from the 1.0 days.
Next time we come across Page.IsPostBack, why not change it to Page.Page.IsPostBack or even Page.Page.Page.IsPostBack? If that's too confusing maybe we could try this.Page.Page.IsPostBack. There are endless possibilities here.
How did this happen? Am I off base here? Is this not as wide spread as I seem to think it is? Is it just my bad luck in the experiences I have had? Have people noticed this on other projects as well? Please tell me it is not just me!
--John Chapman
Posted by John Chapman at 12:45 PM 4 comments
Labels: asp.net
Tuesday, November 6, 2007
VSTS Web Test Error Found
I previously mentioned that I was intrigued by the Visual Studio 2005 Web Test feature, but was having issues with actually running my tests on an application which uses Forms Authentication (See the prior post: Visual Studio 2005 Team Edition Web Test)
Well, it turns out I found my answer, and I don't like it. I found a Knowledge Base article on MSDN which describes my issue (http://support.microsoft.com/kb/936005). It turns out that forms authentication doesn't work at all! This is after I had seen other places that seemed to indicate it would work flawlessly.
How in the world does this make it out the door? They say they have a hotfix for it (which you have to go through holy hell to actually get btw), but why is it a hotfix and not at least something which was fixed by SP1? People, Visual Studio 2005 has been out for over 2 years, and this basic piece of functionality still doesn't work? How is that acceptable?
The quote in the "Cause" section of the knowledge base is priceless.
This problem occurs because the cookies in Visual Studio 2005 Team System use the Request for Comments (RFC) specification. However, some Web browsers do not use the RFC specification. For example, Internet Explorer does not use the RFC specification.
I would have expected Internet Explorer to have been the one browser Microsoft actually tested with. How would Microsoft put out a tool that doesn't properly support its own browser?
I'm not sure if I'll just wait for Visual Studio 2008 to see if that resolves my issues (it is due for release this month after all) or if I should look for a way to run my tests without using Forms Authentication.
--John Chapman
Posted by John Chapman at 6:18 PM 1 comments
Labels: asp.net, Testing, Visual Studio 2005
Saturday, November 3, 2007
Visual Studio 2005 Team Edition Web Test
Recently I have become very interested in Visual Studio 2005 Team Edition's Web Test feature. I'm a firm believer in unit testing and we use it heavily at work. However, for front-end testing we've always relied on hand testing. There are regression tests which go with every release, but unfortunately for us, due to the size of the system, only a random portion of those tests is run for each release.
We've been interested in automated front end testing tools for a long time. I thought the Web Test would be perfect for what we were looking for. Unfortunately this is not my current top priority and only a small amount of time has been spent researching how to use the tool, but so far no go!
We use forms authentication for our application. I'm able to record a basic test which involves logging in to the application, but when the test is played back it doesn't work! The user is never authenticated. I verified the user name and password in the coded test. Everything looks right, yet it doesn't work! The fun part is that after logging in the test is supposed to go to a new page in the system, but since the user has not been authenticated the system returns a page stating that the user is not authorized to access it, and the built-in test checks call that a success! The test passes because it received a 200 response. I know I can add my own validation, but I thought it would at least have checked the URL against the recording.
So while I was able to find walkthrough documentation from Microsoft, nothing explained forms authentication. Does anyone reading this have experience testing forms authentication applications? I thought it was just supposed to pass the user/password to the form and be done with it. Is there something I'm missing?
On the plus side, I am looking forward to the web test enhancements built in to Visual Studio 2008. I'm actually pretty excited about the pending release. I believe it should be available any day to MSDN subscribers. I enjoy feeling like a kid again every time a new version of Visual Studio is released!
If I'm able to resolve my issues reasonably soon you can rest assured a new blog post will be made describing how I resolved it. Wish me luck!
--John Chapman
Posted by John Chapman at 1:08 PM 2 comments
Labels: Visual Studio 2005