where is this from?
What characters are you seeing in parameters that you don't want to be there? How are the corresponding parameters defined?
Hi Jeffrey,
I think the best option here is to use the TAIL function. Please see the documentation for implementation details.
An example from the HoleFoods Sample:
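As a minimal sketch against the HOLEFOODS sample (the level and measure names here assume the default DateOfSale hierarchy; verify them in Architect before use), TAIL returns the last n members of a set:

```
SELECT TAIL([DateOfSale].[Actual].[MonthSold].MEMBERS, 3) ON ROWS,
       [Measures].[Amount Sold] ON COLUMNS
FROM [HOLEFOODS]
```

This would show the three most recent months in the hierarchy; you can paste it into the MDX shell or Analyzer to confirm the output.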
Peter
I get an error when I try to post a race car emoji, so just imagine that this comment is a race car emoji
Hi Kevin,
This can be achieved by applying a dynamic filter spec to your cube/subject area. This can be done within the %OnGetFilterSpec method. Below you will find a sample that can be used against the HOLEFOODS Sample:
ClassMethod %OnGetFilterSpec(pFilterSpec As %String) As %String
{
    Set pFilterSpec=""
    If $USERNAME="Peter" {
        Set pFilterSpec="[PRODUCT].[P1].[PRODUCT CATEGORY].&[Candy]"
    } ElseIf $USERNAME="Kevin" {
        Set pFilterSpec="[PRODUCT].[P1].[PRODUCT CATEGORY].&[Pasta]"
    }
    Quit pFilterSpec
}
In this sample, when I log in, I will only see data related to Candy sales. When you log in, you will only see data related to Pasta sales. If anyone else logs in, they will see all categories.
Subject Areas are typically used on top of cubes when you want to filter the data like this. You can also do this directly on the cube, but if you stick to Subject Areas for this purpose, it is easier to create multiple subject areas with different criteria.
The logic and specs can be more complex if needed. I usually go to Analyzer and put together the necessary criteria and use the generated MDX directly or as a guide for my filter spec.
Peter
This is fixed in v1.1.3
Please feel free to create an Issue on GitHub, and include the CSV file you are trying to use.
Peter
Hi Lawrence,
This value does default to 2. If you see this value continuously defaulting to 0, please let us know and we can look into it.
Peter
Hi Lawrence,
In your Dashboard settings have you configured the "Work Lists" option? To get the filters and favorites, this will need to be set to 2. This can also be modified with URL parameters. Please see the documentation for parameter options and syntax.
Peter
Hi Semen,
You can accomplish this using Pivot Variables + Calculated Members.
You will need to define a pivot variable named "CommonDate" (or any name you like; we will use "CommonDate" for this example). This Pivot Variable will contain values like "65211". The Patients Cube in SAMPLES contains an example YEAR Pivot Variable.
You will then define two calculated members:
1) Start Date = "[StartDate].[H1].[StartDay].&[$variable.CommonDate]"
2) End Date = "[EndDate].[H1].[EndDay].&[$variable.CommonDate]"
You can then create an "Apply Pivot Variable" control in your dashboard pointing to the "CommonDate" pivot variable. This date will then be applied to both dimensions even though they have different names.
Thank you for this quick reference table (and for my *looks up amount of points for comments* 30 points!)
How are you linking from one dashboard to another?
I do not currently have a screencast, but the Flash Talk from Global Summit 2018 on the Community Channel can be viewed for now.
This is the way I do it when I need to
Modifying your level definition in Architect to have a "Sort Order" of "desc numeric" will provide the sort that you desire. Note that this will change the sort everywhere this level is used, not just in this one pivot.
You MIGHT also be able to create a new measure used just for sorting this dimension; the measure could be based on the $H value for the date. I have not tested this, but in theory it should work. You would also need to use the AVG aggregate, since SUM would not work for this.
After selecting the Terminal tab (like you have done in the picture), you can click on the "Open a Terminal" button on the right side of the bar that contains the tabs. This will open a "Launch Terminal" popup. Mine defaults to "Telnet Terminal" for "Choose terminal". In here, I normally just type "localhost" into the "Host" field and leave the rest of the default values. After clicking OK, I am connected to my instance.
Note: Make sure %Service_Telnet is enabled
Hi Jaqueline,
You may also want to consider implementing %OnProcessFact (documentation link). This callback lets you run code to determine whether each record should be included, which can make advanced logic easier to implement than a build restriction would. For many people, though, using the Build Restriction is enough. As Evgeny says, the Build Restriction adds a WHERE clause to the SQL used to get the source records during a cube build.
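As a sketch of the callback (check the class reference for the exact signature; the HoleFoods.Transaction class and its AmountOfSale property are used here as an assumed example source), you implement it in the cube class and set the skip flag for records to exclude:

```
ClassMethod %OnProcessFact(pID As %String, ByRef pFacts, ByRef pSkip As %Boolean) As %Status
{
    // Exclude any source record whose sale amount is not positive.
    // The source class and property names are assumptions for this sketch.
    Set sale = ##class(HoleFoods.Transaction).%OpenId(pID)
    If $IsObject(sale) && (sale.AmountOfSale <= 0) {
        Set pSkip = 1
    }
    Quit $$$OK
}
```

Setting pSkip = 1 causes the record to be left out of the fact table during the build.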
Jon mentions that this will result in a Persistent class. Once you have this Persistent class, you can use it as the source for a DeepSee Cube. Once the DeepSee Cube is built, you will be able to use your data within Analyzer
To automate this with built-in DeepSee methods, please see the documentation for %ExportPDFToFile. This can be called from a scheduled task, and the output saved or emailed on an interval. Other utility methods such as %GetMDXFromPivot allow you to supply a list of pivot names and programmatically get the MDX to pass into the export method.
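A rough sketch of that loop follows; the pivot names and output directory are hypothetical, and you should check the %DeepSee.Utils class reference for the exact parameter lists of %GetMDXFromPivot and %ExportPDFToFile before relying on this:

```
ClassMethod ExportPivots() As %Status
{
    // Hypothetical pivot names (as shown in the User Portal) and output path
    Set pivots = $ListBuild("Sales by Region","Sales by Product")
    For i=1:1:$ListLength(pivots) {
        Set name = $List(pivots, i)
        // Turn the saved pivot into an MDX query string
        Set mdx = ##class(%DeepSee.Utils).%GetMDXFromPivot(name, .tSC)
        If $$$ISERR(tSC) Continue
        // Run the query and render the result to a PDF file
        Do ##class(%DeepSee.Utils).%ExportPDFToFile(mdx, .tParms, "/tmp/export_"_i_".pdf")
    }
    Quit $$$OK
}
```

A method like this can then be wrapped in a %SYS.Task.Definition subclass and scheduled via the Task Manager.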
Hi Damian,
I believe there are a few cases where this can happen - most of them involving the use of source control. Do you have source control enabled on all of your namespaces except ENSDEMO?
I would suggest opening a new WRC for this issue since there are a few possible causes and we may need to know a little bit more about your system.
DeepSee does not have anything built in to automate this behavior. Depending on how the relationship is defined, this could cause significant overhead (and if more than one relationship is defined, the overhead grows further). The recommended way of doing this is exactly as you have done.
An alternative approach is to implement %OnProcessFact and have it process any related cubes that need to be updated. The advantage is that it avoids adding overhead to your main application while records are being saved; the downside is that it requires CubeB to be synchronized at the same rate as CubeA.
Using the Cube Manager, you could write some "pre-synchronize code" that runs queries to find records in CubeA that need to be updated and calls ##class(%DeepSee.Utils).%SetDSTimeIndex() for each one. This is quite similar to what it sounds like you are doing now, but it avoids triggers and builds the process into the synchronize itself.
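A rough sketch of such pre-synchronize code follows. The source class, column names, and the tracking global are hypothetical, and the %SetDSTimeIndex argument list here is an assumption; check the %DeepSee.Utils class reference for the exact signature:

```
ClassMethod PreSynchronize() As %Status
{
    // Hypothetical query: find CubeA source rows changed since the last run
    Set stmt = ##class(%SQL.Statement).%New()
    Set tSC = stmt.%Prepare("SELECT ID FROM MyApp.SourceA WHERE LastModified > ?")
    Quit:$$$ISERR(tSC) tSC
    Set rs = stmt.%Execute($Get(^MyApp.LastSync))
    While rs.%Next() {
        // Flag the record so the next synchronize picks it up
        // (argument list is a sketch; verify against the class reference)
        Do ##class(%DeepSee.Utils).%SetDSTimeIndex("MyApp.SourceA", rs.ID)
    }
    Set ^MyApp.LastSync = $ZDateTime($Horolog, 3)
    Quit $$$OK
}
```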
This page should be helpful
Thanks for the input on this. I have considered these negative consequences as you describe. I am pulling my data from REST services, so it would be perfectly fine for me to delete all my data if need be. Perhaps once the early phases of development are finished these diffs will become less active and not a problem.
The second reason is that I have been developing on both Caché 2017.2 and InterSystems IRIS. The storage definition changes slightly based on which one I am using, so it creates a little bit of noise, but it is ultimately manageable. Perhaps doing development across products is not recommended, so that could also be the answer.
Have you tested this and seen it work? I tried this before posting the question and it does not appear to work for me.
If you use the custom listing in Analyzer and view the MDX that it generates, it will look something like this:
DRILLTHROUGH SELECT FROM [HOLEFOODS] RETURN Outlet->City "City",Comment "Comment"
You should be able to construct this however you want, assuming the return fields exist in your source (or fact table if you are using DRILLFACTS)
You can accomplish this by using pivot variables. By using this approach, the pivot variable can act as the Key for two different date dimensions. Example:
Pivot variable with value of &[2017].
[Date1].[H1].[Year].$variable.Year
[Date2].[H1].[Year].$variable.Year
On your dashboard, you would want to use the Apply Pivot Variable control instead of the Apply Filter control.
I vote for Tim's answer over mine
Would querying against %Dictionary.CompiledClass and adding a where clause on SqlSchemaName do what you need?
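For example, something like the following (the schema name MyApp is a placeholder for your own):

```
SELECT Name, SqlTableName
FROM %Dictionary.CompiledClass
WHERE SqlSchemaName = 'MyApp'
```

This returns the compiled classes whose projected SQL tables live in that schema.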
Brace always for me too, unless I am writing some quick debugging code or something along those lines