Please note that this site has moved over to https://schottsql.com. Please follow me over there for the most recent posts and updates.
Friday, September 8, 2023
New Site Home
Thursday, April 21, 2016
Getting All Permissions for Server
We recently had to work out a way to fetch and display all effective permissions granted on our production SQL Server. I had been required to get all permissions at the database level before, but even that tended to be more granular, showing only which roles you belonged to and which explicit permissions you'd been granted. We wanted to run something through a loop of all logins and get that information into a report. It turns out this was easier than I'd expected, especially after my first attempts to dig through the system tables.
We used a script something like the following:
EXECUTE as login = 'DomainName\Login';
select 'DomainName\Login' AS Login, * from fn_my_permissions (NULL, 'SERVER');
REVERT;
With a little work this can be used within PowerShell or SQL to loop through all logins and capture the information in a table or report.
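As a rough sketch of that loop, here's how you might build the per-login batch outside of SQL. This is Python only for illustration, the helper name is mine, and in practice you'd feed the resulting script to sqlcmd or Invoke-SqlCmd (or do the same loop with a T-SQL cursor):

```python
def permissions_batch(logins):
    """Build one T-SQL batch that captures server-level effective
    permissions for each login via EXECUTE AS ... REVERT."""
    parts = []
    for login in logins:
        safe = login.replace("'", "''")  # escape single quotes for T-SQL literals
        parts.append(
            f"EXECUTE AS LOGIN = '{safe}';\n"
            f"SELECT '{safe}' AS Login, * FROM fn_my_permissions(NULL, 'SERVER');\n"
            "REVERT;"
        )
    return "\n".join(parts)

# Print the generated batch for a couple of placeholder logins
print(permissions_batch(["DomainName\\Login1", "DomainName\\Login2"]))
```

The list of logins itself can come from a quick query against sys.server_principals.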
The function can also be used at the database level by switching to the database in question and substituting 'DATABASE' for 'SERVER' above:
USE AdventureWorks2012;
EXECUTE as login = 'DomainName\Login';
SELECT 'DomainName\Login' AS Login, * FROM fn_my_permissions (NULL, 'DATABASE');
REVERT;
For an interesting take on revamping sp_msforeachdb, check out this script from Edward Pollack (t). It might help you if you need to loop through all of your databases to run something like this for your auditors.
Monday, February 29, 2016
SQL Server Audits and Action_IDs
We were recently redoing our SQL Server audits and I was reminded again how painful setting up the filters can be. MS expects an integer for "action_id", but to actually use them, you need to know what those integers actually mean.
I came across this blog post by Chris Provolt listing the text versions of the action_id's. That was helpful, especially the quick query to see what's available:
SELECT DISTINCT action_id, name, class_desc, parent_class_desc FROM sys.dm_audit_actions;
However, as you can tell by running this, the action_id's returned are all text values. That doesn't help when trying to set up your SQL Audits.
MSDN provides code for a function to translate the text action_id into the expected numeric value.
CREATE FUNCTION dbo.Getint_action_id (@action_id VARCHAR(4))
RETURNS INT
AS
BEGIN
    DECLARE @x INT;

    -- Byte 1: the ASCII value of the first character
    SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 1, 1))));

    -- Bytes 2-4: shift each remaining character up by 8, 16, and 24 bits,
    -- padding with a space (ASCII 32) when the action_id is shorter
    IF LEN(@action_id) >= 2
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 2, 1)))) * POWER(2, 8) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 8) + @x;

    IF LEN(@action_id) >= 3
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 3, 1)))) * POWER(2, 16) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 16) + @x;

    IF LEN(@action_id) >= 4
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), UPPER(SUBSTRING(@action_id, 4, 1)))) * POWER(2, 24) + @x;
    ELSE
        SET @x = CONVERT(INT, CONVERT(VARBINARY(1), ' ')) * POWER(2, 24) + @x;

    RETURN @x;
END;
Once you create that function, you can use it to generate a list of the Integer action_ids, the text action_ids, and the name/description of those action_ids. (You can also expand it out to see which actions are applicable to various objects at the server and database level if you so desire.)
SELECT DISTINCT dbo.Getint_action_id(action_id) AS Action_ID_Int, action_id, name AS Action_Description
    --, class_desc, parent_class_desc
FROM sys.dm_audit_actions
ORDER BY action_id;
This will result in the following values (as of SQL 2012):
Action_ID_Int | action_id | NAME |
---|---|---|
1329873729 | ACDO | DATABASE_OBJECT_ACCESS_GROUP |
542065473 | ACO | SCHEMA_OBJECT_ACCESS_GROUP |
1329742913 | ADBO | BULK ADMIN |
1346651201 | ADDP | DATABASE_ROLE_MEMBER_CHANGE_GROUP |
1347634241 | ADSP | SERVER_ROLE_MEMBER_CHANGE_GROUP |
538987585 | AL | ALTER |
1313033281 | ALCN | ALTER CONNECTION |
1397902401 | ALRS | ALTER RESOURCES |
1397967937 | ALSS | ALTER SERVER STATE |
1414745153 | ALST | ALTER SETTINGS |
1381256257 | ALTR | ALTER TRACE |
1280462913 | APRL | ADD MEMBER |
538989377 | AS | ACCESS |
1129534785 | AUSC | AUDIT SESSION CHANGED |
1179866433 | AUSF | AUDIT SHUTDOWN ON FAILURE |
1213486401 | AUTH | AUTHENTICATE |
538984770 | BA | BACKUP |
541868354 | BAL | BACKUP LOG |
1111773762 | BRDB | BACKUP_RESTORE_GROUP |
1179595331 | C2OF | TRACE AUDIT C2OFF |
1313813059 | C2ON | TRACE AUDIT C2ON |
1196180291 | CCLG | CHANGE LOGIN CREDENTIAL |
1196182851 | CMLG | CREDENTIAL MAP TO LOGIN |
1430343235 | CNAU | AUDIT_CHANGE_GROUP |
538988355 | CO | CONNECT |
538988611 | CP | CHECKPOINT |
538989123 | CR | CREATE |
538976324 | D | DENY |
1179074884 | DAGF | FAILED_DATABASE_AUTHENTICATION_GROUP |
1279738180 | DAGL | DATABASE_LOGOUT_GROUP |
1397178692 | DAGS | SUCCESSFUL_DATABASE_AUTHENTICATION_GROUP |
1178681924 | DBAF | DATABASE AUTHENTICATION FAILED |
1396785732 | DBAS | DATABASE AUTHENTICATION SUCCEEDED |
1128481348 | DBCC | DBCC |
1195590212 | DBCG | DBCC_GROUP |
541868612 | DBL | DATABASE LOGOUT |
538987588 | DL | DELETE |
1280462916 | DPRL | DROP MEMBER |
538989124 | DR | DROP |
541284164 | DWC | DENY WITH CASCADE |
538990661 | EX | EXECUTE |
538989638 | FT | FULLTEXT |
541545542 | FTG | FULLTEXT_GROUP |
538976327 | G | GRANT |
1111773767 | GRDB | DATABASE_PERMISSION_CHANGE_GROUP |
1329877575 | GRDO | DATABASE_OBJECT_PERMISSION_CHANGE_GROUP |
542069319 | GRO | SCHEMA_OBJECT_PERMISSION_CHANGE_GROUP |
1330860615 | GRSO | SERVER_OBJECT_PERMISSION_CHANGE_GROUP |
1448301127 | GRSV | SERVER_PERMISSION_CHANGE_GROUP |
541546311 | GWG | GRANT WITH GRANT |
1346653513 | IMDP | DATABASE_PRINCIPAL_IMPERSONATION_GROUP |
542133577 | IMP | IMPERSONATE |
1347636553 | IMSP | SERVER_PRINCIPAL_IMPERSONATION_GROUP |
538988105 | IN | INSERT |
541214540 | LGB | BROKER LOGIN |
1195525964 | LGBG | BROKER_LOGIN_GROUP |
1094993740 | LGDA | DISABLE |
1111770956 | LGDB | CHANGE DEFAULT DATABASE |
1095059276 | LGEA | ENABLE |
1279674188 | LGFL | FAILED_LOGIN_GROUP |
1179207500 | LGIF | LOGIN FAILED |
1397311308 | LGIS | LOGIN SUCCEEDED |
1196181324 | LGLG | CHANGE DEFAULT LANGUAGE |
541935436 | LGM | DATABASE MIRRORING LOGIN |
1196246860 | LGMG | DATABASE_MIRRORING_LOGIN_GROUP |
1296975692 | LGNM | NAME CHANGE |
542066508 | LGO | LOGOUT |
1146308428 | LGSD | SUCCESSFUL_LOGIN_GROUP |
538988364 | LO | LOGOUT_GROUP |
1111772749 | MNDB | DATABASE_CHANGE_GROUP |
1329876557 | MNDO | DATABASE_OBJECT_CHANGE_GROUP |
1346653773 | MNDP | DATABASE_PRINCIPAL_CHANGE_GROUP |
542068301 | MNO | SCHEMA_OBJECT_CHANGE_GROUP |
1330859597 | MNSO | SERVER_OBJECT_CHANGE_GROUP |
1347636813 | MNSP | SERVER_PRINCIPAL_CHANGE_GROUP |
1196182862 | NMLG | NO CREDENTIAL MAP TO LOGIN |
538988623 | OP | OPEN |
1111773263 | OPDB | DATABASE_OPERATION_GROUP |
1448300623 | OPSV | SERVER_OPERATION_GROUP |
1380013904 | PWAR | APPLICATION_ROLE_CHANGE_PASSWORD_GROUP |
541284176 | PWC | CHANGE PASSWORD |
1195595600 | PWCG | LOGIN_CHANGE_PASSWORD_GROUP |
1396922192 | PWCS | CHANGE OWN PASSWORD |
1480939344 | PWEX | PASSWORD EXPIRATION |
1129142096 | PWMC | MUST CHANGE PASSWORD |
1280333648 | PWPL | PASSWORD POLICY |
542267216 | PWR | RESET PASSWORD |
1397905232 | PWRS | RESET OWN PASSWORD |
542463824 | PWU | UNLOCK ACCOUNT |
538976338 | R | REVOKE |
538985298 | RC | RECEIVE |
538986066 | RF | REFERENCES |
538989394 | RS | RESTORE |
541284178 | RWC | REVOKE WITH CASCADE |
541546322 | RWG | REVOKE WITH GRANT |
538987603 | SL | SELECT |
538988115 | SN | SEND |
1313624147 | SPLN | SHOW PLAN |
1448301651 | STSV | SERVER_STATE_CHANGE_GROUP |
1313953107 | SUQN | SUBSCRIBE QUERY NOTIFICATION |
1313035859 | SVCN | SERVER CONTINUE |
1146115667 | SVPD | SERVER PAUSED |
1146312275 | SVSD | SERVER SHUTDOWN |
1381193299 | SVSR | SERVER STARTED |
1095975252 | TASA | TRACE AUDIT START |
1347633492 | TASP | TRACE AUDIT STOP |
538988372 | TO | TAKE OWNERSHIP |
1111773012 | TODB | DATABASE_OWNERSHIP_CHANGE_GROUP |
1329876820 | TODO | DATABASE_OBJECT_OWNERSHIP_CHANGE_GROUP |
542068564 | TOO | SCHEMA_OBJECT_OWNERSHIP_CHANGE_GROUP |
1330859860 | TOSO | SERVER_OBJECT_OWNERSHIP_CHANGE_GROUP |
1195594324 | TRCG | TRACE_CHANGE_GROUP |
542069332 | TRO | TRANSFER |
1346847573 | UCGP | USER_CHANGE_PASSWORD_GROUP |
1195459669 | UDAG | USER_DEFINED_AUDIT_GROUP |
1430340693 | UDAU | USER DEFINED AUDIT |
538988629 | UP | UPDATE |
1178686293 | USAF | CHANGE USERS LOGIN AUTO |
1196184405 | USLG | CHANGE USERS LOGIN |
1129599829 | USTC | COPY PASSWORD |
1414743126 | VDST | VIEW DATABASE STATE |
1414746966 | VSST | VIEW SERVER STATE |
1413699414 | VWCT | VIEW CHANGETRACKING |
538984792 | XA | EXTERNAL ACCESS ASSEMBLY |
538989912 | XU | UNSAFE ASSEMBLY |
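If you'd rather sanity-check the packing outside SQL Server, the same byte math is only a few lines. This Python sketch (helper name is mine) reproduces the values in the table above: upper-case the action_id, pad it to four characters with spaces, and pack the characters little-endian into an integer:

```python
def action_id_to_int(action_id: str) -> int:
    """Pack a 1-4 character action_id into the INT the audit system
    expects: first character in the low byte, space-padded to 4 chars."""
    padded = action_id.upper().ljust(4)      # pad with spaces (ASCII 32)
    value = 0
    for position, ch in enumerate(padded):
        value += ord(ch) << (8 * position)   # little-endian byte packing
    return value

print(action_id_to_int("AL"))  # 538987585, matching ALTER in the table above
```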
Saturday, August 15, 2015
Database Lifecycle Management - Dacpacs
https://www.simple-talk.com/sql/database-delivery/microsoft-and-database-lifecycle-management-dlm-the-dacpac/
The PowerShell scripts are well worth checking out, and there are several interesting calls to SQLPackage along the way. I appreciated his documenting some of the issues he hit, such as changes that should have been detected but weren't. I suspect there's a setting not configured properly there, but couldn't say for sure. I know that I haven't hit any of the issues described at the end of the article when using SQLPackage and dacpacs for normal DB development.
The conclusion is spot on: whatever solution you use, check your scripts, test, and verify before putting it into production.
Monday, July 20, 2015
Pre-deploy SQL Scripts before checking for changes
Here's the main article.
The general idea is that you add a SQL Script to your project with:
Build Action = "None"
Copy To Directory = "Copy Always" (or "Copy if Newer")
When you set up your publish actions, look for this script first, run it through PowerShell, SQLCMD, or whatever other option you might have, and then run your SQLPackage commands as normal. The pre-deploy scripts run first, and the schema compare then runs against the modified database to bring it up to date.
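As a command sketch only (server, database, and paths are all placeholders, and your build output location will differ), the publish step ends up looking something like this:

```shell
# Run the copied pre-deploy script first (-b aborts on error),
# then let SqlPackage diff against the now-modified database.
sqlcmd -S MyServer -d MyDatabase -b -i ./bin/Release/PreDeploy.sql

sqlpackage /Action:Publish \
    /SourceFile:./bin/Release/MyDatabase.dacpac \
    /TargetServerName:MyServer \
    /TargetDatabaseName:MyDatabase
```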
The idea is pretty clever, though I haven't come across much need for it when deploying DB changes. I'm not quite comfortable with a script changing the state of the database to something unknown before the script that brings it in line with version control runs. I suppose it has its use cases, though, as I've come across this request several times on Stack Overflow.
Friday, November 7, 2014
SSAS Tabular and Excel PowerPivot
I don’t know why it took so long before making the connection between PowerPivot models and SSAS Tabular models, but in the last couple of days I came across several blogs and videos tying the two together. With that in mind, I took the plunge into PowerPivot and built a quick model off of some SQL queries to test out a concept.
I started with the basics of mocking up a fact table. I stripped out decorator text, “renamed” a couple of columns in the query, filtered the data so I wouldn’t pull back a crazy amount into my local workstation, verified the counts and look of the data in a SQL query, and added that. Excel quickly loaded 1.2M rows and didn’t blink.
After that, I knew I needed a couple of Dimension-like tables. I wrote queries for those in a similar fashion and added the data again. Each addition went smoothly. I was able to add the proper relationships quickly to tie everything together.
To finalize the model, I created a couple of hierarchies to make querying easier. After messing around with these several times in SSAS Multi-Dimensional, I was shocked at how easy it was to create a hierarchy for dates and even certain domain-specific entries in our data: Right-click, create new hierarchy, drag and drop columns. Done.
After the model was finished, I dropped the relevant fact data into a Pivot Table and could easily analyze the data in all sorts of interesting ways. I had wrestled with OLAP for quite some time to build something similar. With PowerPivot, I wrote a couple of queries, linked the tables, and had something workable within an hour. As an added benefit, I know I can scale that up to an SSAS server without too much trouble and have a similar experience.
Hat tip to the folks @ www.powerpivotpro.com and www.sqlbi.com for the many articles and pointers to get me started and over that initial hump.
Friday, September 26, 2014
Exploring BIML
Getting Started
I attended Varigence’s BIML Workshop last year prior to the PASS conference and got to learn a little BIML while seeing what it can do. Recently, I realized that I had a job to do that required applying the same SSIS ETL to several different sets of tables in our system. I figured this would be a great time to put some of that BIML into use.
To get started, you’ll need either the BIDS Helper add-on or a copy of Varigence’s Mist. While Mist is an excellent tool, it’s also priced towards the people who are really working with BIML and not targeted at those who aren’t quite ready to justify the cost. ($249/month or $3999 at the time I’m writing this – worth it if you use a lot of BIML, but not for everyone)
I made sure my BIDS Helper bits were up to date, started a new SSIS Project, and added a new BIML file by right-clicking the project and selecting “Add New BIML File”. After that I followed a couple of excellent walkthroughs and tutorials to get me started and refresh my memory. To name a couple of sources:
- Andy Leonard’s Stairway to BIML articles at SQL Server Central
- Scott Currie’s Create and Load a Staging Environment from Scratch in an Hour With BIML
- Paul S. Water’s Getting Started and Basic BIMLScript Structure walkthroughs at the BimlScript.com site’s Walkthrough page.
- The helpful people who frequent the Varigence BIML forums
- David Stein (b | t) for letting me bounce some ideas off of him
Challenges
1. Actually getting my connection string correct. I don't often type in connection strings, so I hit the ConnectionStrings.com website for SQL OLEDB connections. That told me to use this format for my connection strings:
Provider=SQLNCLI11;Server=myServerAddress;Database=myDataBase;Trusted_Connection=yes;
Regardless of what I tried, this refused to connect as expected. I eventually changed back to the older syntax and that worked:
Provider=SQLNCLI11;Data Source=myServerAddress;Initial Catalog=myDataBase;Integrated Security=True;
Not a huge deal, but this was giving me fits, either with security errors or with having to specify database names in my code.
2. Next up – not knowing C# or BIML very well. Nothing for this except time and practice. :-) I got better along the way, slowly catching things as I typed them in and having some feel for what would happen.
3. For some reason, my Intellisense completely stopped working at one point. It worked for everything prior to when I added my “import” lines for various namespaces, but anything after that wouldn’t work at all. I got around this a little by using the BimlScript Editor online to check some code, but obviously this doesn’t work when you’re importing local columns. I never did get this working but also haven’t tried to reproduce it with a new project.
Update: It seems that if I move my @ Import statements before I open the first Biml tag in the XML, my Intellisense works again. Putting the statements within the Biml section is perfectly valid, but putting them at the top of the file, before I start writing Biml, makes a lot more sense and results in Intellisense working again. This would have made my life easier.
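In other words, the layout that got Intellisense working again looks roughly like this (the imported namespace is just an example; the xmlns is the standard Biml one):

```xml
<#@ import namespace="System.Data" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
    <Connections>
        <!-- connections, packages, etc. -->
    </Connections>
</Biml>
```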
4. The last major challenge this time through was trying to get BIML to generate a logical "AND" in the SSIS expression I wanted to use. I typed it in, used "&&" for my AND statement, and checked syntax. I repeatedly got syntax errors. The first time around, I figured my expression might be too long. It was, so I broke it apart, only to get the same error. I looked at the generated string using some MessageBox.Show() hacks and it looked okay. I finally came across a tip indicating that BIML doesn't handle a raw "&" character well - since BIML is XML, the ampersand has to be escaped. The suggestion was to replace each "&" with "&amp;" whenever you wanted to output an ampersand as part of the expression. This worked well and my package was generated.
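Since the fix is just standard XML entity escaping, any XML-escape routine does the substitution for you. A quick Python illustration (the SSIS expression here is made up):

```python
from xml.sax.saxutils import escape

# An SSIS expression with a logical AND, as you'd type it in the designer
expression = '@[User::RowCount] > 0 && @[User::IsValid] == TRUE'

# Escaped form that is safe to embed inside a BIML (XML) document:
# each "&" becomes "&amp;" (and ">" becomes "&gt;")
print(escape(expression))
```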
Next
I still need to tweak my package. I have more tables to import from a different source on the same server, but targeting a single database on my destination server. I need to figure out tiers and included files to better reuse and organize code. I’d love to get the correct connections going so BIML and SSIS read the correct column data types, especially when it comes to BLOB data columns. I need to rework some of the package components to better adhere to best practices. Overall it’s been an interesting ride trying to figure these things out and while it may not have saved time with this particular project, I’ll be able to use the lessons learned for my next project.