Why does INSET NORMALTEST display missing (a period) on my histogram?

I have a simple data set that I'm trying to make a histogram of, but the NORMALTEST keyword on the INSET statement doesn't seem to work for some data. This is the code so far:

Code:

DATA CLINIC;
  INPUT ID    $ 1-3
        GENDER $  4
        RACE  $  5
        HR      6-8
        SBP      9-11
        DBP    12-14
        N_PROC  15-16;
  AVE_BP = DBP + (SBP - DBP)/3;  /* average blood pressure */
DATALINES;
001MW08013008010
002FW08811007205
003MB05018810002
004FB  10806801
005MW06812208204
006FB101  07404
007FW07810406603
008MW04811207006
009FB07719011009
010FB06616410610
;

proc univariate data=clinic normal;
  var sbp;
  /* fit a normal curve and show the normality test statistic in an inset */
  histogram sbp / normal;
  inset normaltest = "norm" (4.2) /
      font = 'Arial'
      pos = nw
      height = 5;
run;

This code works; however, when I use a different data set, it simply displays a period in the inset box next to "norm".

What characteristics of the data would lead to this? I can't post the actual data since it's confidential and too large, but I really need to figure this out because the graphs need to show that statistic.
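
In case it helps narrow things down, here is a quick summary step I could run against the real data (the data set name MYDATA is a placeholder, and SBP stands in for whatever variable goes on the HISTOGRAM statement) to report the count, number of missing values, and variance of that variable:

Code:

proc means data=mydata n nmiss min max var;
  /* summarize the variable used on the HISTOGRAM statement */
  var sbp;
run;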
