There appears to be a bug in how Visual Studio does "Add Service Reference" proxy generation for VB.NET projects.
I ran into this issue when trying to add a service reference to Bing Maps, and I believe it is not limited to that. I think it stems from the generator not understanding that enum instance fields cannot be tested for equality using .Equals in VB... not sure why.
I think the solution is to simply replace the offending line:
If Me.enumfield.Equals(value) <> true Then
with:
If Not Object.Equals(Me.enumfield, value) Then
BTW: DO NOT compare Boolean results to True as they have done! Write:
If boolean_results Then
or, negated:
If Not boolean_results Then
Specifically, this block of code is wrong:
Public Property CompareOperator() As SearchService.CompareOperator
    If (Me.CompareOperatorField.Equals(value) <> true) Then
        Me.CompareOperatorField = value
and should be replaced with this:
Public Property CompareOperator() As SearchService.CompareOperator
    If Not Object.Equals(Me.CompareOperatorField, value) Then
        Me.CompareOperatorField = value
[EDIT] Made this edit so search engines would pick it up... so I can find my own fix next time I run into it. :) [/EDIT]
It is still broken in VS2010 and will break your fixed project when upgrading. Gosh, thanks.
I'm actually going to encourage you to consider using GC methods. Well, now hang on cowboys... this is for pretty limited circumstances.
You see, the GC class can do more than compact memory. It can also run finalizers on unreferenced objects. If you are writing a class that includes a finalizer (which you would only need if your class is the lowest-level class dealing with an unmanaged resource, or is managing a pool of scarce resources), then you might consider using GC. Even though you might (read: should) implement IDisposable, if a user of your library fails to Dispose your object, that unmanaged resource will leak until, at some non-deterministic (read: random) point, the garbage collector runs and the object's finalizer gets queued and run. If it leaks often enough, your attempt to create additional objects with the unmanaged resource could fail because you have exhausted the available supply.
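For reference, such a class typically pairs IDisposable with a finalizer as a safety net. A minimal sketch -- WidgetHandle, NativeAcquire, and NativeRelease are hypothetical stand-ins for your real unmanaged calls:

```vb
Public Class WidgetHandle
    Implements IDisposable

    ' Hypothetical unmanaged handle; NativeAcquire/NativeRelease stand in
    ' for whatever P/Invoke calls your real class would make.
    Private handle As IntPtr = NativeAcquire()

    Public Sub Dispose() Implements IDisposable.Dispose
        Cleanup()
        GC.SuppressFinalize(Me) ' Nothing left for the finalizer to do.
    End Sub

    Protected Overrides Sub Finalize()
        Try
            Cleanup() ' Safety net: runs only if the caller forgot to Dispose.
        Finally
            MyBase.Finalize()
        End Try
    End Sub

    Private Sub Cleanup()
        If handle <> IntPtr.Zero Then
            NativeRelease(handle)
            handle = IntPtr.Zero
        End If
    End Sub

    Private Shared Function NativeAcquire() As IntPtr
        Return New IntPtr(1) ' Placeholder for a real unmanaged acquisition.
    End Function

    Private Shared Sub NativeRelease(ByVal h As IntPtr)
        ' Placeholder for a real unmanaged release.
    End Sub
End Class
```

The SuppressFinalize call is the important part: a properly Disposed object never burdens the finalizer thread at all.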
Rather than fail, you could attempt:
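Presumably this means forcing a collection and waiting for finalizers to run; a minimal sketch of the recovery step:

```vb
' Last-ditch recovery: force finalizers on leaked objects to run so their
' unmanaged resources are released, then reclaim the finalized objects.
GC.Collect()
GC.WaitForPendingFinalizers()
GC.Collect()
```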
and try again. If it works on the retry, you might want to log that for the developer's information. One might argue that this just encourages and forgives sloppiness, but I look at it as a step towards robustness, and it can actually help the developer diagnose a real problem, whereas not doing this would just lead to more random, run-time, only-under-heavy-load failures.
If you want to be really awesome: once you detect that you had to recover, flip a bit on to start recording a stacktrace on each create; then, in your finalizer, put that stacktrace in an application-wide queue that gets logged by the next create. Now you've told the dev exactly where to go find the spot where they created your object but didn't dispose it.
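One way to sketch that idea -- every name here (TrackedResource, EnableLeakTracking, the leakLog queue) is hypothetical illustration, not a real API:

```vb
Imports System.Diagnostics
Imports System.Collections.Generic

Public Class TrackedResource
    Private Shared recording As Boolean = False
    Private Shared ReadOnly leakLog As New Queue(Of String)

    Private birth As StackTrace ' Captured only while recording is on.

    Public Sub New()
        If recording Then birth = New StackTrace(True)
        FlushLeakLog() ' Each create reports any leaks seen so far.
    End Sub

    ' Flip this on after the first successful GC recovery.
    Public Shared Sub EnableLeakTracking()
        recording = True
    End Sub

    Private Shared Sub FlushLeakLog()
        SyncLock leakLog
            While leakLog.Count > 0
                Debug.WriteLine("Undisposed instance was created at: " & leakLog.Dequeue())
            End While
        End SyncLock
    End Sub

    Protected Overrides Sub Finalize()
        ' Reaching the finalizer means nobody called Dispose.
        If birth IsNot Nothing Then
            SyncLock leakLog
                leakLog.Enqueue(birth.ToString())
            End SyncLock
        End If
        MyBase.Finalize()
    End Sub
End Class
```

The SyncLock matters because the finalizer runs on the GC's finalizer thread, not the thread doing the creates.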
Another place is when you are calling something that has an unmanaged resource behind it along with other code that isn't behaving. For example, you are writing a web part in SharePoint that is calling new SPSite and OpenWeb... perhaps you are being a good SPRequest citizen, but another web part on the site that you didn't write isn't being so nice.
You could make auto-recovery wrappers for the stuff you use. Your NewSPSite method would attempt to return a new SPSite, but should that fail, Collect + WaitForPendingFinalizers and try again. Ugh, what a pain. But sometimes it beats trying to explain to someone why it is another web part's fault that your web part explodes.
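A hedged sketch of such a wrapper -- NewSPSite is my own name; SPSite is the real SharePoint type, and the retry policy is just the one described above:

```vb
Imports System.Diagnostics
Imports Microsoft.SharePoint

Public Module SPRecovery
    ' On failure, force finalizers on leaked objects (e.g. undisposed
    ' SPSite/SPWeb instances holding SPRequests) to run, then retry once.
    Public Function NewSPSite(ByVal url As String) As SPSite
        Try
            Return New SPSite(url)
        Catch ex As Exception
            GC.Collect()
            GC.WaitForPendingFinalizers()
            ' Worth logging: we only got here because something leaked.
            Debug.WriteLine("New SPSite failed once; retrying after GC: " & ex.Message)
            Return New SPSite(url) ' A second failure propagates to the caller.
        End Try
    End Function
End Module
```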
SharePoint has a great API for most things, but some things seem a bit lacking.
I recently had a need to auto-create folders inside a document library -- basically, what mkdir with command extensions does on the command line: if directory x exists but x\y doesn't, mkdir x\y\z will create x\y and then x\y\z.
After some struggling, here's what I ended up with (the Sub name was lost to formatting, so EnsureFolders below is my reconstruction):

Public Shared Sub EnsureFolders(ByVal url As String)
    Using site As New SPSite(url)
        Using web As SPWeb = site.OpenWeb()
            If url.StartsWith(web.Url) Then
                Dim folder As SPFolder = web.RootFolder
                For Each segment As String In url.Substring(web.Url.Length + 1).Split("/"c)
                    Dim nextfolder As SPFolder = web.GetFolder(folder.Url & IIf(folder.Url.Length > 0, "/", "") & segment)
                    If Not nextfolder.Exists Then
                        folder = folder.SubFolders.Add(nextfolder.Name)
                    Else
                        folder = nextfolder
                    End If
                Next
            End If
        End Using
    End Using
End Sub
Okay, I know it doesn't look like much, but that code represents some hard-won knowledge. The main thing was dealing with folders that had spaces in them. Before, I was System.Web.HttpUtility.UrlDecode()ing them, but using .Name on a folder whose .Exists is False proved to be much more elegant. Even though I'm doing a GetFolder on an escaped URL, the .Name property returns a nice, suitable-for-passing-into-SubFolders.Add(), de-%20-ed name. But why use this ugly, indirect path-string-building technique? Well, I could not find an exception-free way of testing a folder for existence other than web.GetFolder() which, of course, needs a web-relative URL -- this led to the clunkier-looking GetFolder expression.
For the two people that read this blog (me and, uh, that might be an exaggerated figure), the reason my snippets have been in VB.NET lately is because I think VB.NET is underloved in the SharePoint community. VSEWSS, for example, is a C#-only club. Heck with that. It doesn't provide enough juice to justify the opaqueness of its wsp builder. VB.NET has some nice features and I like taking advantage of them -- I know both cold and I make no apologies for choosing VB.NET. If you are one of those C# 1337ists... run the benchmarks and tell me it's so much better. Tell me what it does so much better that it makes up for the utter lack of exception filters, XML literals (VB9), pleasant event raising, a switch that goes beyond the 1980s-era model (Select Case), optional parameters, automatic by-reference parameters for callers, array resizing, and procedure-scoped static vars. I'm not saying C# is worse -- it's like the difference between ibuprofen and acetaminophen. Minor advantages in some edge cases both ways, but they both fix most headaches.
I've seen some comparisons between the two languages on the net before, but none seemed entirely complete. If I am ever put in jail with nothing but a toilet and a laptop for a few years, perhaps I'd blog /the/ definitive list.
For those of you thinking of writing in with your reasons why C# is soooo superior to VB.NET, please refer to the Logical Fallacy article on Wikipedia before embarrassing yourself. Reasons like "real programmers use semicolons" or "only a moron would use a language with the word 'basic' in the name" will be ridiculed mercilessly. There are some valid arguments for C#, and one could make an equally compelling case for VB.NET.
Everyone knows exception handling is relatively expensive, but it is important to keep in mind where the cost lies. It is primarily in the throw action. The price of a try is nearly zero when no exceptions occur.
In the IL instruction stream, the only difference between a wrapped block of code and a plain one is a LEAVE instruction that jumps around the catch block, plus an entry in the method's exception table marking the protected region of code. What happens when we actually time the difference? The result is extremely small -- and strangely enough, try/catch in the loop is slightly faster for C#! Because the numbers we are dealing with are so small, however, I would suggest that this is coincidental noise due to instruction alignment or something like that. I would guess that the cost of try is essentially crushed out by the JIT compiler's optimizer.
The really expensive operation is throwing exceptions. How expensive? My tests show it's about 36 microseconds each. Slow by comparison, but maybe not as slow as you thought. As expected, VB's exception handling was a little slower (~1%), due no doubt to the SetProjectError calls that are inserted before throws/rethrows.
I had always assumed that THROW (a rethrow) was cheaper than THROW ex or THROW NEW; I reasoned that the stack frame would not have to be gathered during a rethrow. Timing it shows rethrow is slightly slower.
I learned you have to be pretty careful about timing code. For example, a significant amount of time is burned JITting a function the first time it is hit, and the first exception thrown is really expensive. These firsts should be burned off before timing begins.
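The warm-up discipline looks something like this -- a sketch only, with my own names, not the attached benchmark code:

```vb
Imports System.Diagnostics

Module ThrowTiming
    Sub Main()
        ' Burn off the firsts: JIT the method and pay the
        ' first-exception tax before the clock starts.
        ThrowAndCatch()

        Const iterations As Integer = 100000
        Dim sw As Stopwatch = Stopwatch.StartNew()
        For i As Integer = 1 To iterations
            ThrowAndCatch()
        Next
        sw.Stop()
        Console.WriteLine("{0:0.00} microseconds per throw", _
            sw.Elapsed.TotalMilliseconds * 1000.0 / iterations)
    End Sub

    Sub ThrowAndCatch()
        Try
            Throw New ApplicationException("timing")
        Catch ex As ApplicationException
            ' Swallow: we are measuring throw/catch cost only.
        End Try
    End Sub
End Module
```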
Try out the attached code, post your results, and please include your environment.
Here's my numbers on a Dell Latitude D830 1.7GHz 64-bit laptop w/ 4GB RAM.
>CSharp.exe 10000000000 100000
Loop without try/catch: 31132 milliseconds
Loop with try/catch: 21251 milliseconds
difference: -9881 milliseconds
throw new with 5 throw ex: 21448 milliseconds
throw new with 5 rethrows: 21370 milliseconds
difference: 78 milliseconds
>VB.exe 10000000000 100000
Loop without try/catch: 23361 milliseconds
Loop with try/catch: 30739 milliseconds
difference: 7378 milliseconds
throw new with 5 throw ex: 21617 milliseconds
throw new with 5 rethrows: 21845 milliseconds
difference: -228 milliseconds
Attached: TestExceptionHandlingInLoop.rar (12.5 KB)
Last month I was working with some code that used System.Xml.Xsl.XslCompiledTransform. I knew it involved compiling stuff (where did I get that idea?) and making some IL dynamically to run. Neat-o. This month, on my road to getting MCPD by month-end, I was reading about Regex. It didn't really dawn on me before that it could compile the expression vs. interpret it. Pretty cool. But I was always skeptical about these because I had heard that you can Assembly.Load, but there's no such thing as Assembly.Unload... so I thought that this meant these would be memory leaks... but, no. I read (http://blogs.msdn.com/joelpob/archive/2004/04/01/105862.aspx) about something called LCG (lightweight code-gen) and DynamicMethod. I was reading this (http://blogs.msdn.com/bclteam/archive/2004/11/12/256783.aspx) and read that memory-leak saga about regular expressions, but "We've fixed that problem in Whidbey" caught my attention. They did? Yes -- DynamicMethod does, by making it so code can be emitted and executed from the managed heap. Very cool.
I had to know if XslCompiledTransform used this, so I reflected across Microsoft.*.dll and System.*.dll to see what uses DynamicMethod... Yes, XslCompiledTransform does. Nice -- but I'm not sure it uses that exclusively. Given it actually spits out .dll files, I have to assume it Assembly.Loads and executes them... We avoided any potential leaks in a web app by holding the transforms in the Application dictionary. This also made the whole thing faster, since each transform was only compiled once.
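The caching we used looked roughly like this -- a sketch; the helper name and cache key scheme are mine:

```vb
Imports System.Web
Imports System.Xml.Xsl

Public Module TransformCache
    ' Compile each stylesheet once per application and reuse it.
    ' XslCompiledTransform is safe to share for Transform() after Load().
    Public Function GetTransform(ByVal app As HttpApplicationState, _
                                 ByVal xsltPath As String) As XslCompiledTransform
        Dim key As String = "xslt:" & xsltPath
        Dim xform As XslCompiledTransform = CType(app(key), XslCompiledTransform)
        If xform Is Nothing Then
            xform = New XslCompiledTransform()
            xform.Load(xsltPath)
            app.Lock() ' HttpApplicationState writes need explicit locking.
            Try
                app(key) = xform
            Finally
                app.UnLock()
            End Try
        End If
        Return xform
    End Function
End Module
```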
By the way, I would encourage .NET devs out there to look into getting certified (MCTS or MCPD). Don't be a wuss and shape-test your way through. (Shape-testing refers to the process of getting test guides that so exactly mirror the real test that you can just memorize the shape of the correct answer to ace it. I bet if given those kinds of study guides written in Thai, I could pass the test. That's bull-khrap.)
Well, after much procrastination, I finally spent some quality time with the GoDaddy hosting control center and various dasBlog instructions and got it working.
For anyone trying the same...
- From GoDaddy, go to "My Hosting Account".
- Click the [open] link under "Control Panel" for the domain.
- Under settings, select ASP.NET runtime and make sure it's 2.0.
- Under settings, check FrontPage extensions and make sure they are NOT installed.
- Select "Directory Management" under "Content".
- Create a custom directory where you'll host the blog with "Read", "Web" and "Set Root".
- Create the subdirectories SiteConfig and logs and give them each read+write access.
- Create another one called content and give it read+write+web access.
- Download the dasBlog webfiles zip and extract it locally.
- Edit your SiteConfig/site.config, setting the obvious-looking stuff.
- Edit your /web.config to get rid of or comment out the line <trust level="Medium" originUrl="" /> (a GoDaddy-ism).
- Copy all the extracted files to your blog directory (using explorer view on ftp:// works)
- Point your browser at your blog.
- Sign in using "admin", "admin" (unless you changed the user name in SiteConfig/siteSecurity.config before you uploaded it)
- Immediately change that password.
** Disclaimer: I didn't try those steps exactly so they may not be entirely perfect. **
Edit: Okay, I just set up another blog using those instructions, and they seemed to work nicely.
Look here for more help on this.
Now, to blog from Word 2007 with images: create a document from the blog post template, set up an account of type "Other", set the URL to http://domain/blogdir/blogger.aspx, and set the picture option to "my blog provider". Publish! It should automatically upload pictures to /content/binary/.
Okay, so what is freachable? It is purgatory for .NET objects with finalizers. During garbage collection, objects that have finalizers need to have their finalizers run, but the GC shouldn't have to wait for them (after all, the entire process is stalled while the GC runs, and finalizer code could affect what the GC is doing). Yet if no references keep the object alive, it would simply be collected. So the object is made reachable again by placing it in the finalizer thread's special queue, called freachable. GC completes, the finalizer thread wakes up, executes the finalizers in the freachable queue, and clears each queue entry -- which was the last remaining reference to the object. The next time GC runs, the object's memory is finally reclaimed.
Read all about GC at http://msdn.microsoft.com/msdnmag/issues/1100/gci/
According to that article, it is pronounced f-reachable. That just sounds l-ame.
So what's the point of this blog? To blog stuff I figure out. I will try not to waste bandwidth on "hey, go read what this other guy figured out". Okay, so most of this post is stuff that someone else figured out, but the web.config mod on dasBlog was not in Aakash's notes.