Partial Methods
Yesterday we talked about Partial Classes, a technique that lets you place portions of a type definition in multiple files that are pulled together at compile time. Paulo reminds us that not only can you define different aspects of a type in multiple files, you can also split a method, placing its declaration in one file and its implementation in another. He shares the MSDN documentation found at http://msdn.microsoft.com/en-us/library/wa80x488.aspx.
One use of partial methods is to let one group on a developer team define a method's signature while another group provides its implementation. This is not the same thing as an interface: the declaration is a physical part of the class in one file, and the full implementation lives in another.
Using partial methods allows the consuming team to declare the method's signature and build code around calls to it without compiler warnings; if no implementation is ever supplied, those calls are simply removed at compile time. When the implementing team adds the method body in its own file, the compiler stitches the pieces together and the result behaves as if it had all been written at one time by one team.
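To make that concrete, here is a minimal sketch of how the two halves might look. The Order class, the Submit and OnValidate methods, and the file names are illustrative only, not taken from the documentation:

// File: Order.Generated.cs -- the declaring (consumer) side
public partial class Order
{
    // Defining declaration: signature only, no body.
    // If no implementation is ever provided, every call to OnValidate
    // is removed by the compiler.
    partial void OnValidate(decimal amount);

    public void Submit(decimal amount)
    {
        OnValidate(amount);   // safe to call even if never implemented
        // ... submit the order ...
    }
}

// File: Order.Custom.cs -- the implementing side
public partial class Order
{
    // Implementing declaration: supplies the body for the partial method.
    partial void OnValidate(decimal amount)
    {
        if (amount <= 0)
            throw new System.ArgumentOutOfRangeException("amount");
    }
}

Note that a partial method must return void and is implicitly private, which is what allows the compiler to strip out unimplemented calls safely.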
Balu also uses partial classes. He writes:
We use partial classes very extensively in our project, where we have a lot of auto-generated code. Our application is a Smart Client and we use the SCSF/CAB patterns.
The SCSF architecture requires the following components:
- Server-side business objects
- Client-side business objects
- Translators that translate server-side BOs to client-side BOs and vice versa
At the same time, we add custom properties, additional methods, validation methods, and so on to the client-side BOs in a partial class, so that the auto-generated code remains intact.
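Balu's approach might look something like this in code. The Customer class and its members are hypothetical names of my own, just to show the pattern of keeping hand-written members out of the generated file:

// File: Customer.Designer.cs -- auto-generated; safe to regenerate at any time
public partial class Customer
{
    public string Name { get; set; }
    public string Email { get; set; }
}

// File: Customer.cs -- hand-written additions live in their own partial class,
// so regenerating the designer file never overwrites them
public partial class Customer
{
    // Custom display property added alongside the generated members.
    public string DisplayName
    {
        get { return string.IsNullOrEmpty(Name) ? "(unknown)" : Name; }
    }

    // Custom validation method kept out of the generated file.
    public bool IsValid()
    {
        return !string.IsNullOrEmpty(Email) && Email.Contains("@");
    }
}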
Thanks for all the comments sent to btaylor@sswug.org.
Cheers,
Ben
SSWUGtv
With Stephen Wynkoop
How do you design an above-average performance review goal? Laura Rose has some fantastic ideas on today’s show.
Watch the Show
Featured Article(s)
Data Mining – Part 1
The abundance of data, coupled with the need for powerful data analysis tools, has been described as a "data rich but information poor" situation. The fast-growing, tremendous amounts of data collected and stored in large and numerous databases have far exceeded our human ability to comprehend them without powerful tools. Because data collection was costly, people learned to make decisions based on limited information, but that approach breaks down when the data set is very large. Hence the concept of data mining was developed.
Featured White Paper(s)
Go Beyond SCOM Monitoring with Foglight for SQL Server
(read more)
Featured Script
dba3_Login_User_Role_SecurityProcedures
Some built-in security procedures useful for login, user, role, and object permissions auditing purposes… (read more)