Sunday, January 26, 2014

CANADA AND POLITICAL POLLS-Sept. 3-4, 2014 / who's ur daddy- ur real daddy- how can 1,000 people reflect 36 million folks?- last Fed election taught us a lot-imho

SEPTEMBER 3-4  2014

 Everything you need to know about political polling

With every election – and increasingly often between elections – new polls come out frequently, purporting to show what percentage of the population say they support which candidates. Polls can be a useful tool to gauge how politicians and issues are faring in the public eye, but they aren't without their downsides. Can you trust poll numbers? And when and how should you take results with a grain of salt?
Here's a guide to understand how polls are done and how to interpret their results.

The Globe and Mail

Why do we poll people about their political preferences?

Polls serve a purpose. They allow the public to ‘speak truth to power’, putting the lie to political spin and keeping our political leaders accountable. Polls can be useful pieces of information that might help voters decide how to cast their ballots. As part of the process of deciding how to vote in an election, we canvass our friends and family, we listen to talk radio, or we read opinion pieces in the newspaper to get an idea of how a race is shaping up. Polls are no different – just more scientific.
As well, people are interested. Few political stories get as much traction as the results of the latest poll. People like to know the state of a political race and who might win, like spoilers for an upcoming comic book movie.
Newsrooms, advocacy groups, and political parties commission pollsters to conduct surveys for them, with the cost ranging from a few thousand to a few tens of thousands of dollars, depending on the type of poll.
Increasingly, though, polling companies conduct polls and hand them to the media for free, or publish them directly on their websites. This is done to promote the company’s services, as pollsters generally make the vast majority of their revenues from market research, rather than political polls.

istockphoto.com

What are respondents asked when they’re polled?

That depends on the survey. If it is a poll about voting intentions, the first question to be asked is usually whether you are eligible to vote. There’s no reason to poll people who can’t cast a ballot.
A properly constructed poll will often get right to the most important questions in order to prevent a bias from seeping in. If an election were held today, which party would you vote for? Some polls allow you to say the name of the party yourself, but most provide a list from which to choose, with the order of the parties being randomized with each call.
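The party-order randomization described above can be sketched in a few lines of Python. This is only an illustration of the idea, not any pollster's actual software; the party list is just an example:

```python
import random

def randomized_options(parties, rng=random.Random()):
    """Return the party list in a fresh random order for one call.

    Shuffling per respondent guards against a 'primacy' bias,
    where options read out first tend to be chosen more often.
    """
    order = list(parties)  # copy, so the master list is untouched
    rng.shuffle(order)
    return order

PARTIES = ["Conservative", "Liberal", "NDP", "Green", "Bloc Quebecois"]
print(randomized_options(PARTIES))
```

Each call produces a different reading order, but the set of options is always the same.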
Other questions might revolve around approval ratings or specific political topics of interest. A poll then usually ends with demographic questions related to age, gender, income, and education. This helps calibrate the poll to get the sample to reflect the general population.
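That calibration step can be illustrated with a toy post-stratification example. The age groups, population shares, and sample counts below are made-up numbers for the sketch, not actual census figures:

```python
# Hypothetical post-stratification: weight each respondent so the
# sample's age mix matches assumed population shares.
population_share = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}  # illustrative
sample = ["18-34"] * 200 + ["35-54"] * 350 + ["55+"] * 450      # 1,000 respondents

n = len(sample)
sample_share = {g: sample.count(g) / n for g in population_share}
weights = {g: population_share[g] / sample_share[g] for g in population_share}
# Younger respondents are under-represented here (20% vs 30%),
# so each one counts for more than 1 in the weighted results.
print(weights)
```

Applying these weights makes the weighted sample mirror the assumed population mix without discarding any interviews.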

How do pollsters reach respondents?

There are three methods that are used by most pollsters today.
One is the oldest method still in use: via the telephone. Numbers are randomly dialed within a given area code, and a live interviewer is on the other end of the line to conduct the poll. These interviewers follow a precise script to ensure that every call is conducted in the same manner. If your number was dialed but you weren’t home, a proper poll will try several times to reach you over the next hours or days. Otherwise, the sample might be skewed by interviewing only people who were at home at 7 p.m. on a Tuesday night. Maybe these people are different than those who are home on Wednesdays.
Another method also uses the telephone, but instead of a live interviewer the call is automated. These are known as ‘robo-polls’. A recorded script is played, and respondents are asked to punch in their responses using the telephone keypad. Press 1 for Conservative, press 2 for Liberal, etc. The advantage of this method is that many calls can be conducted quickly and cheaply – you don’t have to train and pay a team of interviewers to get the results.
The last method that is becoming increasingly ubiquitous is via the internet. Polling firms assemble a panel of internet users, often numbering in the hundreds of thousands. These panelists can be recruited in various ways, including through internet advertisements and over the telephone.
Once the panel is built, pollsters then survey among the members of that panel, ensuring that those who complete the survey are broadly reflective of the target population.

photos.com

What error is there in the numbers?

There are a number of sources of error, the most important one being sampling error. No matter how perfectly a poll is conducted, there will always be a degree of error associated with sampling a small portion of a large population. This is reported as the ‘margin of error’, and in a standard poll of 1,000 people that margin of error is plus or minus 3.1 per cent, 19 times out of 20. That means that if the poll had been conducted in exactly the same way 20 times, in 19 of those cases the results would be within 3.1 points of actual public opinion.
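The ‘3.1 per cent, 19 times out of 20’ figure comes from the standard margin-of-error formula for a simple random sample, which can be checked in a few lines of Python (using the conventional worst case of 50 per cent support and a 95 per cent confidence level):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n.

    p = 0.5 is the worst case (widest interval); z = 1.96 is the
    conventional '19 times out of 20' (95%) confidence multiplier.
    """
    return z * math.sqrt(p * (1 - p) / n)

print(round(100 * margin_of_error(1000), 1))  # 3.1 (percentage points)
```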
This assumes, however, that the sample was drawn randomly and that everyone in the target population has an equal chance of being interviewed. This is why telephone polling can still carry a margin of error – virtually everyone has a telephone, be it a landline or mobile phone (and yes, most pollsters do sample cell phones). But not everyone will respond. Response rates have dropped to 10 per cent or less, from roughly 1-in-3 in the past. This might have an important effect on the accuracy of a poll, though there is some debate in the industry about whether or not this effect is significant.
Internet polls, because they survey a subset of the population that has joined a panel, are not supposed to carry margins of error, at least according to the main industry bodies in Canada and the United States. There are still errors associated with these polls; however, they are not supposed to be measured in the same way as for a randomized telephone poll. Nevertheless, internet polls are designed to be as accurate as their telephone counterparts, and so should be expected to perform just as well. And they usually do.

But how can 1,000 people reflect the opinions of a country with a population of 35 million?

It might be hard to believe, but it is mathematically possible. A smaller sample will, of course, have a harder time reflecting the population accurately. But a poll of 1,000 people is generally considered the standard size. Larger polls have smaller margins of error, but the return on that extra effort is smaller as well. Doubling the sample size does not cut the margin of error in half, for example.
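The diminishing return follows from the standard margin-of-error formula: the margin shrinks with the square root of the sample size, so a sample must quadruple to cut the margin in half. A quick Python check (worst-case 50 per cent support, 95 per cent confidence level):

```python
import math

def moe(n, p=0.5, z=1.96):
    """95% margin of error for a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# Doubling n from 1,000 to 2,000 shrinks the margin from about
# 3.10 to about 2.19 points -- not to 1.55.
for n in (1000, 2000, 4000):
    print(f"n={n}: plus or minus {100 * moe(n):.2f} points")
```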
One common way to explain how sampling works is to imagine a pot of soup. The pot of soup contains a large number of different ingredients mixed together. That is our target population. If you dip a spoon into the pot and taste it, the spoonful will likely taste like the rest of the pot of soup. That’s our poll.
You don’t need to eat the entire pot to know what it tastes like. The odds of getting a spoonful that is completely unrepresentative of the entire pot of soup are low – and it is the same with polling samples. If the pot of soup has been mixed together properly (or if a sample is collected randomly), a small sample of it should be reflective of the entire pot (or the entire population).

B.C. Premier Christy Clark, on election night May 14, 2013. (The Canadian Press)

How accurate have polls been in predicting election results?

They can do very well, and in most elections they reflect the outcome quite closely. There have been some notable misses, of course, such as the provincial elections in Alberta in 2012 and British Columbia in 2013. But while the polls might have missed those outcomes, they did choose the winner in the most recent provincial elections in Saskatchewan, Manitoba, Ontario, Quebec, New Brunswick, Nova Scotia, Prince Edward Island, Newfoundland and Labrador, and federally. Cases like Alberta and British Columbia are so notorious because they are rare.
And there were some reasons for those misses. In Alberta, most pollsters stopped polling almost a week before the election, and so may have missed a late shift in voting intentions. In British Columbia, there was great difficulty in modelling the voting population.
There are degrees of accuracy as well. The polls in the last provincial election in Ontario were mixed, though most had the Liberals winning – just not necessarily with a majority. In the 2011 federal election, every poll at the end of the campaign put the Conservatives in first by a comfortable margin and the NDP in second (and about to make a huge breakthrough in Quebec). No poll gave a strong indication, however, that the Conservatives would win a majority government.
In other cases, however, the polls can do remarkably well. This was the case in the last elections in Nova Scotia and Quebec.
It is impossible to know beforehand which elections will be polled well and which will not. Often there are reasons intrinsic to each campaign that can help or hinder the accuracy of the polls. But misses like Alberta and B.C. are very rare – only marginal errors should be expected in most cases.

Who are the polling firms that work in Canada?

Canada has a limited number of national firms, as well as some firms that do regional polling.
Pollsters that conduct national polls include Abacus Data, Angus Reid Global, EKOS Research, Forum Research, Harris-Decima (now part of Nielsen), Ipsos Reid, Léger and Nanos Research. Infrequently, other polling firms release national numbers as well.
At the regional level, there are, among others, Insights West and the Mustel Group in British Columbia, ThinkHQ in Alberta, Insightrix in Saskatchewan, Probe Research in Manitoba, CROP in Quebec, and Corporate Research Associates in Atlantic Canada.

Produced by digital politics editor Chris Hannay. Graphic by Murat Yukselir.

Follow us on Twitter: @GlobePolitics

http://www.theglobeandmail.com/news/politics/everything-you-need-to-know-about-political-polling/article20313717/
--------------


CBC AUDIENCE SHARE- SUPPERTIME NEWS TUMBLES BY 45%- Alan White (so highest no. 70,000 people out of 36 Million is now.... well .... there u have it...)


Audience share for CBC's supper news tumbles by 45% by Alan ...
www.friends.ca/news-item/1500
Audience share for CBC's supper news tumbles by 45% ... the CBC news program that held a 59 per cent audience share in ...

AND... CTV

CTV Rakes in Viewers as SAVING HOPE is Once Again Thursday's Top Drama With 1.3 Million Viewers

-----------------

political polls

1. accuracy- who u gonna call? and what do u want 4 an answer

2. how u going 2 prove a fair poll

3. Just the facts - unblemished- do u want the poll 2 be fair or politically favoured?

4. Unequal distribution of questions???

5. Media bias- lies and more lies- know which media favours  which party

---------------


How to Conduct Election Poll Surveys

By an eHow Contributor
  

Polls are an important way to gauge how an election will turn out before votes are counted. Done correctly, polls provide an accurate representation of the way people are voting. There are a few general guidelines to follow to ensure accuracy when you conduct election poll surveys.
Other People Are Reading
How to Design Polling Questions
How Are Political Polls Done?

Instructions
Learn to Conduct Election Poll Surveys

1
Decide on the form of media that will reach the biggest and most diverse audience. If you have connections to a newspaper, television station or radio show, then you may be able to persuade them to carry your poll idea. If you have a website or blog with significant traffic, consider placing the poll on your site.

2
Tailor your poll question to the current political atmosphere. For example, you don't want to ask how voters will vote in the general election if the primary elections haven't happened yet. Also consider restricting your poll to one political party.

3
Keep your poll open for a short amount of time and let your audience know how long it'll be open. A day is usually a long enough time for news media-based polls, but if you're polling through the mail or telephone, you may want up to a week to ensure a sizable polling base.

4
Close the polls and tally up the results. You should take steps to report the results to the people you polled, although this may not be possible in the case of a telephone poll.

5
Analyze your respondents to determine if your poll is projectable. If you made efforts to represent all the demographics in the area of your poll, you may be able to say that the results would stay the same if the poll included every person in that area.

6
Start planning your next poll. If you want to build credibility for your election poll surveys, you'll need to continue to poll the population as the election cycle moves forward. Change your polling questions to reflect how the political landscape develops.



Read more: http://www.ehow.com/how_2061789_conduct-election-poll-surveys.html#ixzz2rV2WrdOe

Tips & Warnings
There are telephone-polling companies that can help you reach a sizable number of people in an area, although these companies may charge high fees.

Some Internet polling software can allow blocking of an IP address after it votes. This method is a safeguard against repeat voting that can skew your results.
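The IP-blocking safeguard amounts to a simple de-duplication step. The sketch below is an illustrative toy, not how any particular polling product works; real software must also cope with households or offices sharing one address:

```python
# Toy vote de-duplication by IP address (illustrative only).
seen_ips = set()
tally = {}

def record_vote(ip, choice):
    """Count the vote only if this IP has not voted before."""
    if ip in seen_ips:
        return False          # repeat vote: rejected
    seen_ips.add(ip)
    tally[choice] = tally.get(choice, 0) + 1
    return True

record_vote("203.0.113.5", "yes")
record_vote("203.0.113.5", "yes")   # blocked as a repeat
record_vote("203.0.113.9", "no")
print(tally)  # {'yes': 1, 'no': 1}
```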

If you want to conduct an exit poll, be sure to ask the state's Election Commission about any restrictions on approaching people after they have voted.





----------------




How to Organize Your Survey for Best Results

Posted on November 7, 2012 by Ivana Taylor


How often have you been asked to create a survey and then been completely overwhelmed by the process?

It’s not unusual, especially because so many people put their focus on the questions rather than the ultimate result they want.  Here is a quick primer on overcoming the overwhelm of putting a survey together.

BEFORE you create the survey

There are three BIG questions to answer for yourself before you sit down to design your survey:
1. What decision are you trying to make?
2. What criteria are you using to make your decision?
3. Whose opinion do you need to make that decision?

If people have already thrown you a list of questions, put them aside and answer these three questions first.

Your next step is to take any questions you’ve already been given or come up with and match them to the three questions above.  You can use the following bullet points to help you organize them:
In what ways will the answer to this question help me make a decision?
Does this question need to be restated or asked another way so that it will better help me make a decision?
In what ways can I ask this question so that I can make a good decision?
Who should answer this question?
Whose answer to this question will be irrelevant or unnecessary?
Does this question address any of the criteria that are critical to my making a decision?
In what ways do I need to reword this question so that it is useful?

Where to begin

Put yourself in your recipients’ shoes. (do it now — take a moment to picture your online survey recipient) — Got it?  Great.  Let’s move on.

Write an engaging and friendly introduction.    When you start a survey on QuestionPro and click on “Introduction”, you’ll see that we’ve written one for you.  DON’T USE IT!  Use it as an inspiration, but don’t use it as is- it’s just there to show you what an introduction looks like.  You’ll want to write one in your own voice.  Here are some tips:
Make it friendly and engaging
Explain in the simplest terms what your survey is about and why your respondents’ opinion is important.  Don’t just tell them it’s important – tell them how their honest response will benefit THEM.
Keep it short and simple.

Qualifying Questions FIRST

If you answered the goals and objectives questions I outlined above, this section should be easy.  The first thing you want to do is qualify your respondents – make sure that you’re not wasting people’s time in filling out the survey.

If you’re clear about your respondent profile – then just ask those questions upfront:
Do you own your home (yes/no) — if you need people who are home owners and your respondent doesn’t own their home, you can take them directly to the “Thank You” page and save them the time in taking the survey.

These are typically closed-ended yes/no questions that quickly focus in on exactly the person you want to answer your survey.
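That screening step amounts to simple skip logic. It might look like the sketch below; the home-ownership question and the page names are just the example from above, not any survey platform's real API:

```python
# Toy skip logic for a qualifying (screening) question: respondents
# who don't match the profile go straight to the thank-you page.
def route(owns_home):
    """Return the next survey page based on the yes/no screener."""
    if owns_home:
        return "main_survey"
    return "thank_you"       # disqualified: don't waste their time

print(route(True))   # main_survey
print(route(False))  # thank_you
```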

If I could only know ONE thing?

The next thing you want to do is prioritize your questions.  If you could only know one thing – what would it be?

If your respondent starts your survey and only gets to answer the first 3 questions — what should they be?  This is insanely important because if your survey takes longer than one minute — this is very likely to happen.  So you want to take a look at all your questions and group them in order of importance.

Every time you add a question to your survey — ask yourself again “If I could only know ONE thing…?”  then choose the next question and add it on the survey.

Most people will put demographic information at the end of the survey — you don’t have to do that.  If one of your primary objectives is to categorize your respondents and get them into groups — then put that demographic info up front.  There is no rule about what goes where.

General survey success tips
Keep the number of questions short.  People really don’t want to spend more than 3 minutes on a survey.  Make sure your questions are easy enough and short enough to get your respondents in and out of the survey quickly.  Typically less than ten easy questions.  Ideally five.
Keep open ended questions to a minimum.  How many times have you clicked through surveys and skipped the open ends?  EXACTLY — no one wants to write their answers out.  AND remember many of your respondents will probably be taking their surveys on a mobile device (killing time perhaps) so make it easy for them to get through the survey just clicking.
Use the page break option after every question.  This isn’t always possible, but respondents love the feeling of moving along – as long as your survey is short.
Use our cool finish options creatively.  QuestionPro has this very nifty feature called Finish Options — where you can literally drive your respondents wherever you like.  For example, you can show them a “Spotlight report” so that they can see how other people answered the same questions.  You can forward them to a custom URL or domain where they can download a gift or actually buy something.  You can also send them a Thank you email.  The possibilities are endless – so be creative.

The bottom line is this.  Don’t just jump into programming a survey.  How you design your survey can have a huge impact on how many people respond and answer your questions and the overall quality of your data.  So take the time to do it right and you’ll reap the rewards.


http://blog.questionpro.com/2012/11/07/how-to-organize-your-survey-for-best-results/



---------

imho-OKAY FOLKS.... Canada has 36 million people.....

-at any high time - perhaps less than 50,000 people watch news -because the net has it all...anywhere- anytime...imho

-polling out of 36 million - 1,996 (and those of us who have adult education certificates know- u can make any survey say what u want... it's how and who u ask.... so come on)  people?  cell phones.... landlines??? Who's ur daddy? ur real daddy?

CANADA'S LAST FED ELECTION-  all the polls crashed badly.... everyday folks did their thing... and all the media bias 4got 2 actually go where the people who vote were.... 
- Conservatives 166 seats
   Opposition- NDP 103 seats

   Liberals... ahem... 34 seats
  and the rest don't count


- frankly, don't u think it's time that all elected are $$$accountable and actually did what they were elected 2 do- work 4 us.... 2gether...

The poll was conducted online in both English and French between Jan. 14-18, 2014. Results are based on responses from 1,996 people. There is no margin of error for the survey since the selection of respondents was not random.


---------------

3 Steps to Organizing a Successful Survey - Lesson Learned

 By     Tim Harper 

 



Introduction: Lessons Learned

When it comes to surveys, there is more involved than merely jotting down a few simple questions and asking random people. There are three basic but mandatory steps involved in organizing a successful survey: planning, composing, and running. In order to maximize the response rate and usefulness of responses, careful planning combined with the proper knowledge is a necessity.

The Lessons Learned white paper is the result of in-depth data harvesting. It reflects a generalization of how people have responded in the past and can be used as a guide while you create your survey. Throughout the document, each step is addressed individually with detail about the actions necessary to complete each step.

Part I: Planning your Survey

In the construction industry, successfully building a safe, reliable, and sturdy structure requires that blueprints always be designed first. The same applies to building surveys. Before you begin writing questions, create your own "blueprint". Write down the goals you wish to accomplish by running this survey. Then, decide on the type of survey you wish to create. Finally, decide on the degree of anonymity you wish to maintain. Successful planning is a necessary step for a successful survey.

First, write down your goals. By writing down your goals, you have a defined point of focus. Sticking to your goals while you write your survey will help keep your survey more refined. Also, it will help you to obtain an optimal length and avoid including things that don't necessarily relate to what you're trying to accomplish.

Next, you must choose which type of survey is right for your needs. There are three main types of surveys: Knowledge Acquisition, 360, and Team Rater. Which one is right for your needs depends on your goals.

Knowledge Acquisition surveys are polls. They involve harvesting opinions about an item, and are very one-dimensional. If your goal is to obtain opinions about an event, goal, idea, etc., then a general Knowledge Acquisition survey is right for you.

A 360 survey is designed to help staff members gain a better understanding on how they perform from multiple perspectives. It involves 2 groups of people: participants and respondents. First, the participant completes a self-evaluation; then chooses people that he/she feels could provide useful feedback. Those people, called respondents, complete the survey giving feedback to the participant. If your goal fits with "I would like to help individuals gain a better understanding of their performance from multiple perspectives and how it compares to their self-evaluation," then perhaps a 360 survey is right for you.

The Team Rater is like a 360, only it is targeted to a team, a group of people who work together. It is designed to help team members gain an understanding of how they complement each other. Each team member completes a survey for every other team member, but unlike the 360, they don't complete a self-survey. If your goal fits with, "I would like to help a team better understand how they complement each other, giving them knowledge of areas where they can improve", then the Team Rater is a great solution for you.
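The Team Rater pairing scheme is easy to sketch: with n team members, each member rates every other member and never themselves, giving n × (n − 1) surveys in total. A minimal illustration (team names invented):

```python
from itertools import permutations

# Each (rater, subject) pair is one survey; no self-surveys, unlike a 360.
team = ["Ana", "Ben", "Cho", "Dev"]
assignments = list(permutations(team, 2))
print(len(assignments))  # 4 members -> 4 * 3 = 12 surveys
```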

Finally, you need to consider how anonymous you would like the respondents of your survey to be. Anonymity is important because it greatly reduces bias. People are instinctively concerned with what others may think of the feedback they give, so they pad their response to conform to expectations. When people know that their responses are going to be kept anonymous, they are more likely to give honest feedback. Keeping the results anonymous will enhance the quality of the feedback received.

Keep in mind, however, there are certain circumstances where anonymity is not possible and can be dismissed. In the case of a 360 survey, for example, anonymity from the manager is sometimes impossible, as some individuals have only one manager. In this case, it is OK for the manager's response not to be anonymous; it is generally a manager's job to communicate the performance of the individual he or she manages. However, you should still maintain anonymity for the other relationships in a 360 survey.

Part II: Composing your survey

Now that you've properly planned your survey, it's time to get down to business and compose it. To compose a survey, you not only need to write questions, you also need to craft effective invitation and reminder messages, as well as welcome and finished messages. If you are running a 360 survey, you'll need two versions of each: one for the participant, and one for the respondent (the person giving feedback for the participant). To ensure these items will obtain optimal results, there are several critical elements to consider.

When composing your questions, compose them as if you were asking them face-to-face. Additionally, you should always follow these important tips.

•To preserve anonymity, avoid demographic questions that will reveal the identity of the respondents.
•Avoid acronyms, jargon, and abbreviations. If your audience doesn't understand the question, they can't respond to it correctly.
•Avoid ambiguous questions and/or vague words that have multiple meanings.
•Write clear, specific questions that are to the point, discarding unnecessary words. For example, don't write questions that contain multiple thoughts. Instead of "Does he perform well on the job and promptly meet deadlines", break it down into two questions: "Does he perform well on the job" and "Does he promptly meet deadlines".
•Avoid questions with an unbalanced response scale. For example, rather than the scale "Disagree; Agree; Strongly agree", use "Strongly disagree; Disagree; Neither disagree nor agree; Agree; Strongly agree".
•Questions should never contain double negatives. Instead of "His comments in meetings aren't unnecessary", use "His comments in meetings contribute to the agenda".
•Get to the point and stay with it. Write questions that are relevant to your goal. It is important to maximize the usefulness of your audience's time; your audience should be able to clearly see an association between the purpose of the survey and the questions asked.
•Keep the survey short. Try to keep it between 25-40 questions, taking no longer than 20 minutes to complete.
Another thing to consider is the order your questions are asked in. You may wish to randomize them to help eliminate bias. One advantage of Internet-based surveys is that vendors such as LearningBridge can do this automatically for you – each participant will see the questions in a random order, something not possible in paper-based surveys.
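Per-respondent randomization might be sketched as follows. Seeding the shuffle with the respondent's ID is an assumption about how a vendor could implement it, not a description of any product; the payoff is that each person's order stays stable across page reloads while varying from person to person:

```python
import random

QUESTIONS = ["Q1", "Q2", "Q3", "Q4", "Q5"]

def question_order(respondent_id):
    """Shuffle the question order per respondent, reproducibly.

    Seeding the RNG with the respondent ID gives each person a
    stable random order, while different people see different orders.
    """
    rng = random.Random(respondent_id)
    order = list(QUESTIONS)
    rng.shuffle(order)
    return order

print(question_order("resp-001"))
```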

When composing your invitation and reminder messages, welcome and finished screens, there are some important elements to consider. Each message should be informative, short, and personal. Paying attention to these things will contribute to the success of your survey.

Invitations and Reminders should include the following:

•E-mail Subject Line - This should contain the purpose of the e-mail summarized to a few words. To create a sense of urgency, use the words RESPONSE REQUESTED or FEEDBACK REQUIRED. Putting words in capital letters helps them stand out more.
•From-Address - You can specify the e-mail address the invitations come from. The respondent e-mail invitation is more likely to be read if it is from someone that the respondent knows, rather than from the online survey vendor.
•Deadline - The survey deadline should be near the top of the e-mail body, where people will be more likely to see it. Additionally, putting 'YOUR FEEDBACK MUST BE SUBMITTED BY' in all capitals will help the deadline stand out more.
•Anticipated time to take survey - It is courteous to predict how much time the survey will take. If your survey involves lots of typing (comment responses), you may wish to give your audience an estimated time for the multiple-choice and direct answers only, and inform them that the comments will take longer.
•Purpose of Survey - Communicate to your audience what the feedback will be used for, and how it will affect them. For example, "This feedback will be used to help evaluate our team efficiency and will not affect your grade." If this invitation is for a 360 and directed to a Respondent (somebody giving feedback to a Participant), then you should include "You are providing feedback for [ParticipantName]."
•Anonymity - Briefly tell your audience if the responses are anonymous. If there are circumstances where they are not anonymous, it is important to tell your audience beforehand when their answers will not be anonymous.
•Include Instructions - In 360's, instruct your participants to complete the self-survey before adding respondents (people who give feedback to participants). This allows the participant to be familiar with the content of the survey before deciding whom he or she will invite to give feedback.
•How to access survey - Tell them where to go and what to do.
Next, compose the welcome screen and finished screen. The welcome screen should welcome respondents to the survey and give a brief insight into it. Also, if applicable, survey instructions or definitions of terms can be included on this page.

The finished screen should express appreciation for their feedback. If applicable, inform them when they'll get their results, and include any additional instructions for them here.

It is important to keep the messages personal. People are much more responsive to messages that are personalized. Consequently, use gender specific pronouns (he/she) and names wherever applicable.

Keeping the messages short should be another goal when designing custom messages. People tend to skim through messages and miss important information if they are long.

Part III: Starting your Survey

All right, you've got this beautifully composed survey that has been planned from start to finish, you've carefully written the questions, and you're probably excited to get it running. Now, all you have left to do is to set an invitation date, a deadline date, set some reminder dates (or an interval, if you prefer), and invite the right people to come take your survey. Sounds easy, right? It is, but there's a little science behind getting a good response rate.

First, let's begin with setting an invitation date. A common misconception in the survey world is that Mondays are a terrible day to send out invitations. The reasoning behind the misconception is that people are just getting back from a two-day break from work: they have a million messages and things to catch up on. Consequently, they are far too busy to pay attention to a little invitation in their inbox inviting them to give feedback.

You might also look at it the other way around – there are just as many people returning from a break after putting their projects at a stopping point the Friday before. They're refreshed, ready to go, and have time to deal with other things before they pick up their major projects again. However, be sure to use your own judgment – there may be other circumstances for your target audience that you may wish to take into consideration, such as a major project deadline coming up.

You should always set a deadline before you start a survey, simply because a survey without a preset deadline will never come to a close. If you wait for a 100% response rate, you'll lose valuable weeks when you could be putting that data to use. On top of that, procrastinators will wait until just before the deadline to do the survey, so setting a firm deadline prompts them to respond. Two and a half to four weeks has proven to be plenty of time to get a good response rate. Closing on a Wednesday is also beneficial, because the survey then closes after the two most responsive days, Monday and Tuesday.

One trick that has proven to be effective in the past is to set the deadline a week earlier than you need it to be. Then, if you don't have enough data by the time that deadline approaches, extend it by a week. By doing this you give the procrastinators another chance to do the survey, and they will usually come around the second time.

E-mail reminders have also proven to be an effective tool in stimulating response. The truth is, the first time people are asked to do something, they often put it off because they're busy with other things. Then they forget about it. In most cases, they'll need to be reminded a few times before they do it.

When scheduling your reminders, you have two options. You can use set reminder dates, specified dates on which reminders will go out. Alternatively, you can set a reminder interval, so each user is reminded every X days after receiving the invitation to complete the survey. Each approach has its advantages.

If you choose to set specified reminder dates, space your reminder dates at least three days apart, and schedule them to remind on alternate days of the week (e.g., remind Thursday, Monday, Friday, then Tuesday). Also, avoid sending out reminders on Saturday and Sunday, as these are the least responsive days. The advantage to using specified reminder dates is that you have more control over the survey. Set reminder dates work well for Knowledge Acquisition and Team Rater surveys because all participants and respondents in the survey are usually added on the same day. In a 360 survey, where respondents are added over several days, set dates do not work as well. For example, a respondent might be invited to complete a survey on a Wednesday and then receive a reminder message the very next day, just because that happened to be the set reminder date.

If you choose to use a reminder interval, it should be between three and six days. Five days is an excellent choice and has proven to work well in the past. Reminder intervals are convenient because they're easier to set up and the reminders go out relative to the day each respondent was invited. The disadvantage is that you have less control.
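The interval scheme above, combined with the advice to avoid Saturday and Sunday reminders, can be sketched in a few lines of Python. This is only an illustration of the scheduling logic; the function name and the weekend-shifting rule are my own, not part of any survey tool described here.

```python
from datetime import date, timedelta

def reminder_dates(invited, deadline, interval_days=5):
    """Generate reminder dates every `interval_days` after the invitation,
    shifting weekend reminders to the following Monday and stopping
    before the deadline."""
    reminders = []
    current = invited + timedelta(days=interval_days)
    while current < deadline:
        # Saturday (5) and Sunday (6) are the least responsive days,
        # so push any weekend reminder to the next Monday.
        while current.weekday() >= 5:
            current += timedelta(days=1)
        if current >= deadline:
            break
        reminders.append(current)
        current += timedelta(days=interval_days)
    return reminders

# Example: invited Monday Sept 1, 2014, three-week deadline, five-day interval.
print(reminder_dates(date(2014, 9, 1), date(2014, 9, 22)))
# → [datetime.date(2014, 9, 8), datetime.date(2014, 9, 15)]
```

Note that with a five-day interval, every reminder after a weekday invitation would otherwise drift toward the weekend, which is why the weekend check runs on every iteration.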

Finally, invite your audience. Invitations are sent out as soon as you enter an email address, so enter the names and e-mail addresses of your intended respondents the night before you want them to receive their invitations. That way, the invitations arrive in the morning, as people are starting their day.

Now that you've set an invitation date, a deadline, and reminder dates or a reminder interval, and have invited your audience, sit back and watch the survey run. You'll get your results a few days after the survey closes.

Conclusion: Interpreting Your Results

When you get the results from your survey, use the data as a guide to making decisions; don't let the data make decisions for you. Keep in mind that the survey process is not perfect: people may not tell the whole truth, and survey bias may be in play. Always temper decisions with common sense and experience.

Also, consider running the same survey multiple times. A single survey gives you a snapshot of what you are looking for, but repeating it gives you a historical record. Being able to see a history of survey results will let you see how the actions you've taken have affected them.

To maximize the response rate and usefulness of responses, careful planning combined with the proper knowledge is a necessity. We hope this document has answered your questions about where to begin with starting a survey, and has given you the guidance and knowledge to create a successful survey.

http://ezinearticles.com/?3-Steps-to-Organizing-a-Successful-Survey---Lesson-Learned&id=3572132

--------------


How to Develop a Research Questionnaire
http://www.wikihow.com/Develop-a-Research-Questionnaire



-------------

A GOOD REALISTIC ARTICLE ABOUT POLLS IMHO...


If it's in the News, it's in our Polls. Public opinion polling since 2003.


67% of Political Class Say U.S. Heading in Right Direction, 84% of Mainstream Disagrees
Tuesday, August 03, 2010

Recent polling has shown huge gaps between the Political Class and Mainstream Americans on issues ranging from immigration to health care to the virtues of free markets.

The gap is just as big when it comes to the traditional right direction/wrong track polling question.

A Rasmussen Reports national telephone survey shows that 67% of Political Class voters believe the United States is generally heading in the right direction. However, things look a lot different to Mainstream Americans. Among these voters, 84% say the country has gotten off on the wrong track.

Twenty-four percent (24%) of Mainstream voters consider fiscal policy issues such as taxes and government spending to be the most important issue facing the nation today. Just two percent (2%) of Political Class voters agree.

With a gap that wide, it’s not surprising that 68% of voters believe the Political Class doesn’t care what most Americans think.  Fifty-nine percent (59%) are embarrassed by the behavior of the Political Class.

Just 23% believe the federal government today has the consent of the governed.

Most voters believe that cutting government spending and reducing deficits is good for the economy. The only group that disagrees is America’s Political Class. In addition to the policy implications, this highlights an interesting dilemma when it comes to interpreting polling data based upon questions that make sense only to the Political Class. After all, if someone believes spending cuts are good for the economy, how can they answer a question giving them a choice between spending cuts and helping the economy?

Mainstream Americans tend to trust the wisdom of the crowd more than their political leaders and are skeptical of both big government and big business.

Fifty-eight percent (58%) of voters currently hold Mainstream views. In January, 65% of voters held Mainstream views. In March 2009, just 55% held such views.

Only six percent (6%) now support the Political Class. These voters tend to trust political leaders more than the public at large and are far less skeptical about government.

When leaners are included, 76% are in the Mainstream category, and 14% support the Political Class.

“The American people don’t want to be governed from the left, the right or the center. The American people want to govern themselves," says Scott Rasmussen, president of Rasmussen Reports. “The American attachment to self-governance runs deep. It is one of our nation’s cherished core values and an important part of our cultural DNA.”

In his new book, In Search of Self-Governance, Rasmussen explains, “In the clique that revolves around Washington, DC, and Wall Street, our treasured heritage has been diminished almost beyond recognition. In that world, some see self-governance as little more than allowing voters to choose which of two politicians will rule over them. Others in that elite environment are even more brazen and see self-governance as a problem to be overcome.”

The book can be ordered on the Rasmussen Reports site or at Amazon.com.

The Political Class Index is based on three questions. All three clearly address populist tendencies and perspectives, all three have strong public support, and, for all three questions, the populist perspective is shared by a majority of Democrats, Republicans and those not affiliated with either of the major parties. We have asked the questions before, and the results change little whether Republicans or Democrats are in charge of the government.

In many cases, the gap between the Mainstream view and the Political Class is larger than the gap between Mainstream Republicans and Mainstream Democrats.

The questions used to calculate the Index are:

-- Generally speaking, when it comes to important national issues, whose judgment do you trust more - the American people or America’s political leaders?

-- Some people believe that the federal government has become a special interest group that looks out primarily for its own interests. Has the federal government become a special interest group?

-- Do government and big business often work together in ways that hurt consumers and investors?

To create a scale, each response earns a plus 1 for the populist answer, a minus 1 for the political class answer, and a 0 for not sure.

Those who score 2 or higher are considered a populist or part of the Mainstream. Those who score -2 or lower are considered to be aligned with the Political Class. Those who score +1 or -1 are considered leaners in one direction or the other.

In practical terms, if someone is classified with the Mainstream, they agree with the Mainstream view on at least two of the three questions and don’t agree with the Political Class on any.
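The scoring scale described above (plus 1 for a populist answer, minus 1 for a Political Class answer, 0 for not sure, then classification by total) can be sketched in Python. This is my own illustrative rendering of Rasmussen's published rules, not their actual code; the label for a score of exactly 0 is an assumption, since the article doesn't name that group.

```python
def classify(answers):
    """Score the three Political Class Index answers and classify the respondent.

    Each answer is 'populist', 'political class', or 'not sure'."""
    points = {"populist": 1, "political class": -1, "not sure": 0}
    score = sum(points[a] for a in answers)
    if score >= 2:
        label = "Mainstream"
    elif score <= -2:
        label = "Political Class"
    elif score == 1:
        label = "Mainstream leaner"
    elif score == -1:
        label = "Political Class leaner"
    else:
        label = "Neither"  # score of 0; the article doesn't name this group
    return score, label

# Two populist answers and one "not sure" scores +2: Mainstream.
print(classify(["populist", "populist", "not sure"]))  # → (2, 'Mainstream')
```

This matches the article's practical description: a Mainstream classification requires agreeing with the Mainstream view on at least two of the three questions while never siding with the Political Class, since any Political Class answer drops the total below 2.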

Initially, Rasmussen Reports labeled the groups Populist and Political Class. However, despite the many news stories referring to populist anger over bailouts and other government actions, the labels created confusion for some. In particular, some equated populist attitudes with the views of the late-19th century Populist Party. To avoid that confusion and since a majority clearly hold skeptical views about the ruling elites, we now label the groups Mainstream and Political Class.

Please sign up for the Rasmussen Reports daily e-mail update (it’s free) or follow us on Twitter or Facebook. Let us keep you up to date with the latest public opinion news.

Rasmussen Reports is a media company specializing in the collection, publication and distribution of public opinion information.

We conduct public opinion polls on a variety of topics to inform our audience on events in the news and other topics of interest. To ensure editorial control and independence, we pay for the polls ourselves and generate revenue through the sale of subscriptions, sponsorships, and advertising. Nightly polling on politics, business and lifestyle topics provides the content to update the Rasmussen Reports web site many times each day. If it's in the news, it's in our polls. Additionally, the data drives a daily update newsletter and various media outlets across the country.
Some information, including the Rasmussen Reports daily Presidential Tracking Poll and commentaries, is available for free to the general public. Subscriptions are available for $3.95 a month or $34.95 a year that provide subscribers with exclusive access to more than 20 stories per week on upcoming elections, consumer confidence, and issues that affect us all. For those who are really into the numbers, Platinum Members can review demographic crosstabs and a full history of our data.
http://www.rasmussenreports.com/public_content/politics/general_politics/august_2010/67_of_political_class_say_u_s_heading_in_right_direction_84_of_mainstream_disagrees

---------------------------



Why telephone polling used to be the best and why it’s dying out
ÉRIC GRENIER
Special to The Globe and Mail
Published Thursday, Jul. 25 2013, 10:36 AM EDT
http://www.theglobeandmail.com/news/politics/why-telephone-polling-used-to-be-the-best-and-why-its-dying-out/article13417520/

-----------
